Date: 2025-05-07 07:25 pm (UTC)
jducoeur: (Default)
From: [personal profile] jducoeur

Thanks for that specific pointer!

Honestly, to me this feels like things have reached the point of moral panic. The query they actually ran amounts to using ChatGPT as a glorified search engine, which is the thing it's genuinely good at, provided somebody checks the references.

(That is, LLMs often hallucinate, but links are links. If a human being follows and evaluates those links, this is more or less precisely the same as using Google for the process, just faster and easier, and likely to reduce the number of scandals showing up after the fact because somebody turns out to have a poisonous background. In their shoes, I might well have done exactly the same thing.)

Tempest in a bloody teapot, IMO -- there are plenty of problems with LLMs, but this sort of witch-hunt is counter-productive. Really, I think the only thing they're guilty of is failing to read the room...
