JO
Posts: 16 · Comments: 214 · Joined: 2 yr. ago

  • I really think this narrative is counterproductive. It’s not like corporations produce greenhouse gases because they think it’s fun. They’re doing it to produce goods that people want at the absolute minimal price possible.

    No corporation is going to choose more environmentally friendly practices out of the goodness of their own hearts unless those practices are cheaper.

    I didn't get past you contradicting yourself in the first three sentences. Sorry.

  • That's not true at all. There was one US cannabis outlet using vitamin E acetate as a diluent in error. It's lethal when heated and not a standard component of any e-cigarette anywhere at all.

  • You're agreeing with me but using more words.

    I'm more annoyed than upset. This technology is eating resources which are badly needed elsewhere, and all we get in return is absolute junk which will infest the literature for decades to come.

  • They cannot be anything other than stochastic parrots because that is all the technology allows them to be. They are not intelligent, they don't understand the question you ask or the answer they give you, they don't know what truth is let alone how to determine it. They're just good at producing answers that sound like a human might have written them. They're a parlour trick. Hi-tech magic 8balls.

  • It will almost always be detectable if you just read what is written. Especially for academic work. It doesn't know what a citation is, only what one looks like and where it appears. It can't summarise a paper accurately. It's easy to force laughably bad output by just asking the right sort of question.

    The simplest approach for setting homework is to give students the LLM output and get them to check it for errors and omissions. LLMs can't critique their own work, and students probably learn more from chasing down errors than from filling a blank sheet of paper for the sake of it.

  • They're circular. If the text is too predictable, it was written by an LLM*, but LLMs are designed to regurgitate the next word most commonly used by humans in any given context (see the sketch below).

    *AI is a complete misnomer for the hi-tech magic 8ball

  • Your "knowledge and skillsets" mean absolutely nothing if you are not prepared to share your sources. "Trust me" does not cut it.

    I'm a medical statistician and part of my job is teaching medics how to navigate the literature. You'll be delighted to know that there's always a massive chunk on Doll & Hill and the methods which arose from the fight to prove that smoking was killing people. Do not try to patronise me. If you cannot be bothered to write up why you think something is true, do not claim that it is true.

  • If you don't care enough to have compiled the evidence, how can you justify expressing a strongly held opinion on a public forum? Just spew out any old bullshit headline as long as it confirms your prejudices? Regardless of how many millions of lives are on the line?

    That's not good enough. If you make a serious claim, back it up with your sources, or just don't do it.
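
A minimal sketch of that circularity, using a toy bigram predictor. The corpus, the counts, and the greedy decoding here are illustrative assumptions, not any particular model or detector: the point is only that a generator which always emits its own most probable next word produces text that scores as highly predictable by construction, which is exactly the signal perplexity-style "AI detectors" look for.

```python
from collections import Counter, defaultdict
import math

# Toy corpus and bigram counts -- purely illustrative, not real data.
corpus = "the cat sat on the mat the dog sat on the rug".split()
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(prev):
    """Probability of each word following `prev`, from the toy counts."""
    counts = following[prev]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# Greedy generation: always emit the most probable next word.
word, generated = "the", ["the"]
for _ in range(5):
    probs = next_word_probs(word)
    word = max(probs, key=probs.get)
    generated.append(word)

# Score the generated text with the same model. Every step was the argmax,
# so the average log-probability is as high as the model can produce --
# i.e. the output is "too predictable" by construction.
log_probs = [math.log(next_word_probs(p)[n]) for p, n in zip(generated, generated[1:])]
print(" ".join(generated))
print("average log-probability:", sum(log_probs) / len(log_probs))
```

The same scoring would rate any sufficiently formulaic human prose as just as predictable, so "too predictable" cannot, on its own, separate the two.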