Evil AI

Date: 2021-08-20 12:26 pm (UTC)
channelpenguin: (Default)
From: [personal profile] channelpenguin
Yes, evil, given. But... does this mean that self-reported race IS in fact An Objective Thing? Which has been generally regarded as "not" up until now (as far as I understand - I could be woefully ignorant).

Re: Evil AI

Date: 2021-08-20 03:41 pm (UTC)
hilarita: stoat hiding under a log (Default)
From: [personal profile] hilarita
Almost certainly not. What's most likely is that the AI is picking up on stuff in the image metadata, image quality, or a combination of things that associates strongly with self-reported race - but not an obvious combination, otherwise human researchers would already have picked it out. If there really were such strong correlations with race, previous studies would have found them (especially ones done in the last 10-20 years, when we've tended to have a reasonable-ish set of standards about how we run clinical trials, including ensuring we have enough statistical power etc.). If Hispanic white people were different from other white people at a cellular level, we'd know about it - not least because every racist shitbag would leap on it as justification. And for most medical imaging, as I understand it, that cellular response is what you're looking for: the contrast differences shown in a meniscus tear, broken bones, pneumonia, etc.

AI has a notorious failure mode where it latches onto something incidental and learns the wrong thing. One famous example is an AI looking for cancerous tumours that learnt not to recognise cancer itself, but that images of cancer tend to have a measurement strip in the frame more often than images without cancer. That one was fairly easy to spot, because you could look at which bit of the image the AI was weighting in its response, and see that it wasn't the bit with the tumour in it.
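That "look at which bit of the image the AI is weighting" check can be done even without access to the model's internals, via occlusion sensitivity: blank out each patch in turn and see how much the score drops. A minimal sketch of the idea - the `occlusion_map` function and the toy `strip_detector` classifier are my own illustration, not from any of the studies discussed:

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4):
    """Occlusion sensitivity: zero out each patch in turn and record how
    much the model's score drops. Big drops mark the regions the model
    is actually relying on."""
    base = score_fn(image)
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = 0.0
            heat[i // patch, j // patch] = base - score_fn(occluded)
    return heat

# Toy stand-in for the cancer classifier: it keys on a bright
# "measurement strip" along the left edge, not on the "tumour" blob.
def strip_detector(img):
    return img[:, :4].mean()

img = np.zeros((16, 16))
img[:, :4] = 1.0        # measurement-strip artefact
img[8:12, 8:12] = 0.5   # the actual "tumour"

heat = occlusion_map(img, strip_detector)
# The hot column of the heatmap sits over the strip, not the tumour:
# the tell-tale sign that the model learnt the artefact.
```

Here the heatmap lights up only over the strip, which is exactly how the cancer example was caught; the worry in the race study is that no single patch lights up like that.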

Now in this study, they've tried all the obvious ways of checking whether your AI has fixated on something unhelpful, and they've not found anything. That suggests it's a lot of small things contributing to the overall impression the AI is getting, and some of that information is going to be an artefact of the sociological context in which the images were collected and labelled. Unless they've made a very basic mistake, there should be no way the AI can deduce race directly - which suggests that racial bias is built into medical imaging itself. That's a strong claim, and I'm sure there'll be a bunch of stuff challenging it.

Btw, while we currently regard race as not an objective thing, this was by no means true 100 years ago. A lot of the post-WWII scientific work has been unpicking scientific racism, and doing better statistics on confounding variables (mostly, let's face it, poverty and discrimination).

Re: Evil AI

Date: 2021-08-20 04:33 pm (UTC)
channelpenguin: (Default)
From: [personal profile] channelpenguin
Thanks for that. Most detailed. I wonder what it could be that DOES correlate? But the researchers themselves haven't been able to work it out. I hope someone does...

Date: 2021-08-20 03:48 pm (UTC)
mountainkiss: (Default)
From: [personal profile] mountainkiss
If your penultimate article is right then I don't think your fourth headline can be fair?

Date: 2021-08-20 05:39 pm (UTC)
calimac: (Default)
From: [personal profile] calimac
Yes, but you have to read that penultimate article (or at least notice the "sexwork" tag) to realize that. Fortunately I did both.

So why didn't OnlyFans blast the true responsibility from the hilltops? Or did they and nobody paid attention?

Date: 2021-08-20 06:16 pm (UTC)
dewline: Text - "On the DEWLine" (Default)
From: [personal profile] dewline
Items 4, 5 and 8 all being directly or thematically linked... we are looking at a horror story that's been ongoing for generations, since before the internet was a thing people could use. Dreamwidth's co-founder - linked via item 8 - has named the factors in play here.

Also, looking at this related item at Vice News, I've been introduced to another acronym: "SWERF".
Edited Date: 2021-08-20 06:29 pm (UTC)
