Posts 0 · Comments 333 · Joined 2 yr. ago

  • Currency is a natural evolution of commerce. Direct barter only works if the person selling what you need wants something you have.

    Say you want to buy flowers. If the florist wants shoes and you only have bread or hammers to spare, then tough luck.

    No large society can function with such a clunky way to exchange goods and services. Currency is merely a proxy that lets both sides trade using a token they both value similarly. Hell, some civilisations used giant boulders as currency... it's hardly a new concept.

  • Depends on the model and your settings, I guess. I use a Fenix 5, and it lasts 18-20 days with a run every second day (though I keep pulse ox off). It could be more if I'd gotten a solar one, but those were pricey back when I bought mine...

  • As long as Russia can keep pushing forward, they won't care about losses. Even by the UAF's own admission, they will probably be forced to abandon Avdiivka within two months or so at the current rate.

    The really important question is whether the Western political climate turns for the better or worse. If Ukraine were to get all the aid it needs and start pushing back, that would be a whole different situation; but if the aid ebbs, this could turn into another Finnish Winter War, where Russia gets away with annexing a chunk of territory.

  • There's a lot of cruelty potential too. In FNAF Security Breach, you can cripple a miniboss by ripping out her eyes, and you can listen to her lament it afterwards. Following that idea, imagine how many gamers would abuse AI-controlled characters in creative ways if those characters reacted believably. Ooh, I can even chop their legs off!

  • > LLMs don't do this, though; they don't do a lookup of past SAT questions they've seen and answer from memory, they use some process of "reasoning" to do it.

    The "reasoning" in LLM is literally statistical probability of which word would follow which word. It has no real concept of what it talks about beyond the pre-built relationship matrices between words and language rules. That's why LLMs confidently hallucinate obvious bullshit time to time - to them there's no meaning to either truthful or absolute bonkers text, it's just words that should probably follow each other.