Posts: 0 · Comments: 114 · Joined: 2 yr. ago

  • Yes, people are being forced to use it if they want to, for instance, search using Google or Bing.

    As the parent comment suggested, there's currently no way to opt out.

    I'm glad you see value in it; I think the injection of LLM responses into search results, which I want to contain accurate results (and nothing more), is a useless waste of power.

  • Yeah, what's the jokey parable thing?

    A CTO is at lunch when a call comes in. There's been a huge outage, caused by a low-level employee pressing the wrong button.
    "Damn, you going to fire that guy?"
    "Hell no, do you know how much I just spent on training him to never do that again?"

    (</Blah>)

  • The only sad thing is that it seems like they're (still?) only talking about the other side's policy, rather than backing their own policy based on its strength.

    Just "I'm not the other side" (but at least with policy rather than personality)

  • 1. I think you're on the wrong community for this question.
    2. The thing regularly referred to as "AI" of late is more accurately called generative AI, or large language models. There's no capacity for learning from humans; it's pattern matching based on large sets of data, boiled down to a series of vectors, to give the most likely next word in a response to a prompt (there's a toy sketch of this below). You could argue that that's what people do, but that's a massive oversimplification. You're right to say it does not have the ability to form thoughts and views. That said, like a broken clock, an LLM can put out words that match up with existing views pretty darn easily!

    You may be talking about general AI, which is something we've not seen yet and have no timeline for. That may be able to have beliefs... but again, there's not even a suggestion of it being close to happening. LLMs are (in my opinion) not even a good indicator or precursor of that coming soon.

    TL;DR: An LLM (or generative AI) can't have or form beliefs.
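
    To illustrate the "most likely next word" point from item 2 above, here's a toy sketch (purely illustrative: the vocabulary, the scores, and the greedy pick are my own stand-ins, not any real model or API):

    ```python
    import numpy as np

    # Made-up "logit" scores a model might assign to candidate next words
    # after a prompt like "the cat sat on the".
    vocab = ["mat", "moon", "dog", "roof"]
    logits = np.array([4.2, 0.1, 1.3, 2.8])

    # Softmax turns the raw scores into a probability distribution.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # Greedy decoding: pick the single most likely next token.
    next_word = vocab[int(np.argmax(probs))]
    print(dict(zip(vocab, probs.round(3))), "->", next_word)  # "mat" wins
    ```

    Real models repeat that step token by token (and usually sample rather than always taking the top pick), but there's no belief or understanding anywhere in that loop.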

  • I got a first-generation Badgy, and it had an issue that prevented it from working with the battery.

    Sqfmi said they'd send out a replacement part to fix it, but they never got back to me.

    I love the ideas they have, but I don't trust them.

  • Another vote for Binging with Babish - though my interest waned when he went from "hey, I could try making that!" to episodes requiring ever more complex and expensive niche machines (e.g. dehydrators), and I completely lost interest around the time he started the "going round buying folk things" series. Never really got back into it, and unsubscribed after a while.

    Bon Appetit was great; then everything happened, a lot of the folk changed (for good reason), and it just lost its appeal for me. I've watched some of the spun-off channels, but a big part of the appeal for me was the interactions.

    I used to religiously watch everything Shut Up and Sit Down put out, but found myself watching less and less over the last few years - turns out they changed their primary content creators and editor (if I understand correctly) around that time, and only announced it recently. I still watch occasionally, but it's a very subtly different style that hits less reliably for me. It may also be related to me managing to play fewer board games lately.

  • That's just not how LLMs work, bud. It has no understanding to improve; it just munges out the most likely next word in line. As a technology, it won't advance past that level of accuracy until it's a completely different approach.

  • Apologies, I thought I'd seen 60 seconds, but having looked I've found a bunch of guesses in headlines, from "every few" seconds to specific numbers, with nothing that looks like a source.

    Going to the source, I found:

    are taken every five seconds while content on the screen is different from the previous snapshot.

    Should have searched first, sorry!
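
    For what it's worth, "taken every five seconds while content on the screen is different from the previous snapshot" amounts to a change-detecting polling loop. A rough sketch of that idea (my own illustration, not Microsoft's implementation; Pillow's ImageGrab and the exact-hash comparison are assumptions):

    ```python
    import time
    import hashlib
    from PIL import ImageGrab  # pip install pillow

    INTERVAL_SECONDS = 5  # matches the quoted "every five seconds"
    last_digest = None

    while True:
        shot = ImageGrab.grab()  # capture the current screen
        digest = hashlib.sha256(shot.tobytes()).hexdigest()

        # Only save a snapshot when the screen differs from the previous one.
        if digest != last_digest:
            shot.save(f"snapshot_{int(time.time())}.png")
            last_digest = digest

        time.sleep(INTERVAL_SECONDS)
    ```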