andrewducker: (Default)
[personal profile] andrewducker
A few days ago [personal profile] danieldwilliam sent me this thread (on their blog here) discussing the Scottish exams fiasco*.

What the thread points out is that exam grading has _always_ been a rubbish approach. It's a way of slicing the kids into layers of competence, so that employers and educators can skim off the layer they want. If everyone does well then the grades are adjusted to make the results fit a curve.

And, on top of that, judging everything you've learned in a course by a single letter makes no sense. A course may teach you dozens of skills, and to represent all of that with one letter is ludicrous.

My preferred method would be that you either understand something or you don't. Break a subject down into atoms of understanding. People then learn those, and build new ones on top of ones they've learned. This would have two effects: (1) you'd be able to see if the person had the actual skills an employer needed (e.g. needing long division, not caring about geometry) and (2) it would highlight places where teachers were trying to teach more complex knowledge on top of basic knowledge that the person didn't actually understand yet. ("Bob, you're trying to teach my kid long division, but he clearly hasn't understood basic division yet"). This was something I saw fairly regularly as a child - kids left behind as the classroom swept on, when a little more work on a few basics might have given them the chance to catch up. If they were lucky then a parent might help them get caught up.
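The prerequisite structure described above is essentially a dependency graph of skills. A minimal sketch of what that might look like, where the skill names, the `PREREQS` map, and the `missing_prereqs` helper are all my own illustrative assumptions rather than anything from an actual curriculum:

```python
# Hypothetical "atoms of understanding": each skill lists the skills it
# builds on, and we can ask which prerequisites a learner is missing
# before anyone tries to teach them something more advanced.

PREREQS = {
    "counting": [],
    "addition": ["counting"],
    "subtraction": ["counting"],
    "multiplication": ["addition"],
    "basic_division": ["multiplication", "subtraction"],
    "long_division": ["basic_division"],
}

def missing_prereqs(skill, mastered):
    """Return every unmastered skill (direct or indirect) that `skill` builds on."""
    missing = set()
    for dep in PREREQS.get(skill, []):
        if dep not in mastered:
            missing.add(dep)
        missing |= missing_prereqs(dep, mastered)
    return missing

# The "Bob" scenario: a pupil being taught long division who has
# everything mastered except basic division.
print(sorted(missing_prereqs("long_division",
                             {"counting", "addition",
                              "subtraction", "multiplication"})))
# prints ['basic_division']
```

The same check, run before each lesson, would surface exactly the gap the teacher needs to go back and fill, rather than a single letter grade for the whole subject.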

I'm not convinced our current teaching setup would work well with this. But then I think our current teaching setup doesn't work for a ton of people**, and I'd like to see us look at ways of making it work for far more of us. I hate to think how many other people are let down by a system designed to mush everything into "This person, overall, understands this huge area better than 90% of other people." - a grade which seems useful for gatekeeping people moving on to the next level of the same subject, and not much else.

Edit: A grading system I would be comfortable with would be to mark people as "Can repeat by rote", "understands concepts", and "can explain concepts to others".

*No exams this year, so grades were given based on prelim exams and teacher predictions. But the exam authority moderated down the predicted grades for children from schools where exam grades were historically lower, effectively saying "You come from a poor area, so we don't believe that your results could be that good." The defense being that even after this downgrading, more people from those schools got good grades than they did last year - in other words the predictions were not reasonable. There are clearly a lot of unhappy people, and this is going to take a while to unpick. More details here. For clarity, this is not just a Scottish problem, the front pages of the papers today say that 40% of A level (English and Welsh exams at 18) results will be lowered.
**To give a personal example, I took three attempts to pass my English language exams aged 16. I have no idea why. I read voraciously. I understood English pretty much as well as I do now. But the exams were asking questions where I literally did not understand what they wanted me to write. Instead, measuring my understanding of specific points, making it clear what it was they were asking me to prove I understood, would have been a huge benefit to me.

Date: 2020-08-08 08:28 am (UTC)
cmcmck: (Default)
From: [personal profile] cmcmck
I was never good at exams, so I blossomed when I reached the sort of uni courses (i.e. MA and post-MA) that let you do your own research.

The problem is how people in that position ever get that far.

Being dysnumeric means I have no maths qualifications for example which means that the roads not taken are vulcanology, seismology and geology which utterly fascinate me.

Date: 2020-08-08 08:56 am (UTC)
aliceinfinland: (Default)
From: [personal profile] aliceinfinland
Your suggested approach is similar to Specifications Grading, in the book of the same name which is getting popular with radical educators. Specs grading makes sense, if the purpose of the education system is to teach and not to reproduce inequality. It is easiest to implement for cumulative subjects like maths that are easy to break down into stages. You can use pass/fail with detailed formative feedback for the more holistic subjects. It's a satisfying way to teach. But then admissions committees and employers have to do their own comparison, instead of relying on a handshake from the previous level. All of this would require more workers than the people who run schools, universities and employers are willing to hire.

Date: 2020-08-08 09:42 am (UTC)
njj4: (Default)
From: [personal profile] njj4
I just heard about Specifications Grading a few weeks ago and it all sounds really interesting. I'm starting to think about how to incorporate it into my undergraduate maths modules - what with everything else that's going on at the moment I suspect I won't get around to it for the next academic year, but it's certainly something I want to try.

Date: 2020-08-09 06:38 pm (UTC)
mellowtigger: (hypercube)
From: [personal profile] mellowtigger
I think this is my first encounter with Specifications Grading. I think I like it. I would caution, though, against a rigid pyramid structure of higher skills that assumes mastery of lower skills. Think of it more like a network of complementary skills, maybe. It's known that sometimes people (particularly autistics) can master so-called higher skills without mastering so-called preliminary requirement skills.

Hand-drawn example of this effect from the doctor who diagnosed me:
http://www.deltaworld.info/media/events/conference.outandequal.autism.20031002/SquarePegs-20031002-Luskin.jpg

My high school English writing can be summarized by the frequent grade, "B- You strayed from the assignment." It suggests that my letter grade would have been higher based on the quality of my writing, but I failed to stay on the teacher-intended topic. I liked my topics more than theirs, apparently. :)

Date: 2020-08-08 09:38 am (UTC)
calimac: (Default)
From: [personal profile] calimac
Do you have any recollection what the inexplicable questions the English exam was asking you were?

Date: 2020-08-08 10:41 am (UTC)
mtbc: photograph of me (Default)
From: [personal profile] mtbc
My problem was English Literature. I was just fine at exercises like "read this first part of a story, now continue it", demonstrating that I did understand various things about what I had read, but I never could work out what they wanted me to write about it in a critical essay.

Date: 2020-08-08 07:44 pm (UTC)
naath: (Default)
From: [personal profile] naath
In my limited experience (I only did either English to GCSE), Literature said "demonstrate that you understand Shakespeare" (I get ALL the rude jokes...) and Language said "write a compelling story" ... and I'm terrible at creative writing, and tolerable at surface-level critical analysis. But since I've never needed to do creative writing this has never actually been a problem. I'd have done better if I'd had to learn about the history of the darn language, but apparently that is a highly specialised course of university study. But I was by & large unfairly good at exams.

CIMA Exams

Date: 2020-08-08 10:16 am (UTC)
danieldwilliam: (Default)
From: [personal profile] danieldwilliam
More coherent thoughts anon.

But.

One of the things that CIMA exams are trying to assess is whether you can work out what the actual question is, because real-life business problems don't walk up and announce themselves by reference to the curriculum.

Re: CIMA Exams

Date: 2020-08-08 10:29 am (UTC)
mtbc: photograph of me (Default)
From: [personal profile] mtbc
This reminds me of a progression I noticed in A-level maths. By the mid-90s it was easy to recognize what algorithm a question wanted you to apply: each one was like previous questions with different numbers. With the 80s-era papers, though, there was an initial, more interesting step of first figuring out how to solve the question being asked, the kind of thing that places like Cambridge still tested via STEP.

Date: 2020-08-08 10:35 am (UTC)
mtbc: photograph of me (Default)
From: [personal profile] mtbc
I love the idea. Where I'm a bit confused is how well it works in reality. I've been puzzled in interviews by candidates who appeared to remember nothing of courses they did well in, so I don't know how useful it is to capture in much detail that they did well if their mind subsequently returns to its previous spotlessness. I also managed to hire a guy who had scored As in a series of undergraduate analogue electronics courses and later turned out to be unfamiliar with the concept of op amps. I'd love to be able to use detailed reports of what people know, but I have lost faith in such.

Date: 2020-08-08 11:00 am (UTC)
aldabra: (Default)
From: [personal profile] aldabra
... and at least you know they are capable of picking it up and using it, if they've done so before. I increasingly can't remember things I used to know, but I can google and go "oh yes" way faster than I can teach myself from scratch. Alas one feels it is deprecated to be seen to use google in job interviews, but that's how you'd do the actual job.

Date: 2020-08-08 07:49 pm (UTC)
naath: (Default)
From: [personal profile] naath
I feel seen (it is a skill though, being able to pull the salient info from the error and comprehend the stack overflow answer)

Date: 2020-08-09 07:18 am (UTC)
chess: (Default)
From: [personal profile] chess
Being able to easily search for things just like you would while actually doing the job is a major advantage of remote interviews...

Date: 2020-08-08 10:55 am (UTC)
aldabra: (Default)
From: [personal profile] aldabra
Yes.

Or, failing that, a system like music grades, where you pick up about a level a year in whatever subjects you're learning *but you don't have to do them continuously*. So 1-5 roughly map onto GCSEs, and 7 is A-level, and 10 is a degree, and 11 is a master's, and 27 is a PhD (just joking...). So I'd be level 1 Mandarin Chinese and level 5 French and level 7 Latin and ooo, joint honours are interesting. Anyway. Level 0 in music, but AFAICT nobody makes you start music at a particular age and keep with the programme.

I like the specifications stuff more, but it's explicitly obvious right now that they don't trust the teachers to do the grading, and the current one-size-fits-all, once, system of public examinations is already unmanageably complex. I used to do invigilation for secondary school exams, and my contemporaneous notes say things like "We had the physics International Baccalaureate exams this morning. Twelve kids; each had a multiple choice paper followed by a written problems paper. This required *41* timed announcements. (Because some were Higher and some were Standard, and some were normal time and some were 25% extra, and everyone gets a half hour warning and a five minute warning, and the written papers have five minutes reading time at the beginning, and no two papers were the same length. This isn't entirely true: the Higher multiple choice with extra time is the same length as the Standard written paper without, but these are sequential so it doesn't help.)" That gets more complicated still if anyone gets to use a laptop, or arrives late, or has a nosebleed, and there aren't enough rooms in the school to be doing it for every year group.

I think all the government actually needs is to be able to identify a tranche of adequately competent kids to go on to university and then keep the system running, and the exams they've got are by and large up to that. They don't need to find the best kids, and they'd rather have ones from rich backgrounds than poor backgrounds because they'll be less inclined to start thinking about inconvenient reforms if they've experienced the system as it is as working for them. And then they explain to that tranche of kids that they've been selected on MERIT and obviously the kids agree and don't look too closely, and the kids who aren't selected get told they weren't good enough and, again, don't look too closely.

Date: 2020-08-08 01:06 pm (UTC)
ninetydegrees: Art: self-portrait (Default)
From: [personal profile] ninetydegrees
"And, on top of that, judging everything you've learned in a course by a single letter makes no sense."

Absolutely. Breaking down a subject makes it easier for everyone to understand what a child has understood or not. Global grades can be awfully demotivating too, making it seem like you're not able to do anything well in that subject, which is rarely true.

"Can repeat by rote", "understands concepts", and "can explain concepts to others".

The problem with "can repeat stuff by rote" is that it mostly tells me whether a student is able to work on their own and whether they have help at home. I prefer giving a student a task and seeing how well they can do it. Grading then comes down to a simple "not at all/still struggling/pretty well done". Now, if only I could provide individual help and more training to the students who need it...

Date: 2020-08-08 08:21 pm (UTC)
agoodwinsmith: (Default)
From: [personal profile] agoodwinsmith
I have lots of opinions about this, but they are all chaotic.

Previously I worked as an admissions advisor, so that means I saw every transcript of every person who applied to my institution. There are almost as many types of transcripts and grading systems as there are students. Not quite, but by gummy there's lots.

Admissions people and human resources people are looking for shorthand, and we also read between the lines. How many times did the person attempt a course; does the person load up with lots of similar courses because they do well in a particular thing; does the person have odd time gaps; are there odd things on the transcript.

It becomes a battle of attrition. As gatekeepers focus on a thing, educators develop cunning ways to manipulate a thing, to massage their students' thing into a better shaped thing. Eventually the thing measured no longer tells you anything useful, and the educators at the applied-to place stage a revolt and want admission to be based on "something real" like a portfolio or an entrance exam, and then the massaging begins on those new things.

Education isn't based on the apprenticeship model where the apprentice moves forward as they master certain skills. Education is an artificial way to reduce access to deliberately limited opportunities. It is about class and not about knowledge.

Date: 2020-08-08 08:59 pm (UTC)
From: [personal profile] mme_n_b
Your suggested approach is basically the three-page list of skills with ratings 0-5 for each that I currently get for my elementary-school (grades 1-5) kids. I agree, it would be nice to see it in the higher grades as well. As for tests - they don't just test your knowledge of English, but also whether you can understand the question, which is information important to the college admissions board.

Date: 2020-08-10 10:09 am (UTC)
danieldwilliam: (Default)
From: [personal profile] danieldwilliam
I like the idea of a more nuanced and atomised assessment.

However, counter-theory. The current system does an adequate job, cheaply.

I wonder if there is a question of value engineering and good enough is close enough.

I'm assuming that an assessment scheme such as you are proposing would be more costly to operate. By operate I mean not just the assessment process itself but also the use of its outputs. (This is an assumption, not a fact.) What is the overall system burden of a better assessment system?

Does the fact that someone has a bunch of Highers tell me if they are a great match for the admin job I have? Not especially, but it quickly and cheaply allows me to sort potentially suitable candidates from probably unsuitable ones. Do I really want the cost of a) working through a more detailed assessment to see if there is a great match in aptitude and b) paying taxes for the overhead of more detailed, more costly assessments which don't add value to a hiring decision? There will always be exceptions, but on average a quick and cheap assessment tool might get society what it needs in terms of matching suitable candidates to suitable jobs. Particularly where, if someone isn't that good at the job, they can be fired.

Similarly for entry to university. Does having 6 A's at Highers tell you that I'm going to be a great law student? Not extraordinarily well, but it's cheap, and if 10% of the class flunk out that might be okay.

Where more detailed assessments are valuable, I suspect they are going to be very tightly tailored to a specific role in a specific organisation and / or valuable enough to be done by the recruiting organisation.

Improved assessment might be useful for students, and more useful for some users of the assessment outcomes, but (my guess for the sake of this discussion) the cost of providing them to a recruiting organisation, and of that organisation using them, outweighs the benefits.

So my follow-up question is: what 21st-century technology could you deploy to make the improved assessment structure cheaper to operate?

Date: 2020-08-10 10:17 am (UTC)
danieldwilliam: (Default)
From: [personal profile] danieldwilliam
Ah, okay, I see the distinction.
