
Posts: 3 · Comments: 1,210 · Joined: 2 yr. ago

  • There's such a vanishingly small number of things that are truly "new" in the 21st century. I'd say just about everything made in the last few hundred years hardly counts as "new" - it's just a synthesis of things that came before, probably tracing back to nature if you go far enough.

    Novelty truly comes from combining existing things in ways that haven't been done before. In this regard, Palworld has done BRILLIANTLY, taking the best parts of some other games, putting them together far better than those games did, and ending up with something that's way more than the sum of its parts.

    Palworld's real failing at the moment is simply that it's in early access. The game is fantastic until you hit a midpoint where the content just falls flat. But, again, it's early access. If there's ever something that can be written off during early access, it's the content not all being finished yet.

  • The broad answer is, I'm pretty sure everything you've mentioned is possible, and you're right in that this is similar to how humans integrate new data. Everything we learn competes with and bolsters every bit of knowledge we already have, so our web of understanding is this ever shifting net of relationships between concepts.

    I don't see any reason these kinds of relationships can't be integrated into generative AI, they just HAVEN'T been yet, and each time you add more ways for those relationships to interact, you also drastically increase the size and complexity of the algorithm and model. I think we're just realizing that what we have now is OK, but it needs to be significantly better before it's really mind-blowing.

  • To be clear, Stable Diffusion isn't one model, it's the generation platform. From there, you have models that sit on top of it. Online generators can use any model, depending on how they're set up. Each model is built from different training data, which means different - sometimes vastly different - results from the same prompts.

    It's a bit like driving somewhere, having someone ask how you found the place, and answering "my phone." Technically correct, but they're probably looking for something more specific, like GPS or a map. Not trying to nit-pick, just giving a bit of information.
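    For anyone curious what that distinction looks like in practice, here's a minimal sketch using the Hugging Face diffusers library. The checkpoint names are just examples of models that sit on top of the Stable Diffusion platform; swap in any checkpoint you like and the same prompt will give noticeably different results.

        # Sketch: the same prompt run through two different model checkpoints
        # on the same Stable Diffusion pipeline. The checkpoint names are
        # illustrative; substitute whichever models you actually have access to.
        import torch
        from diffusers import StableDiffusionPipeline

        prompt = "a watercolor painting of a lighthouse at dusk"

        for checkpoint in ("runwayml/stable-diffusion-v1-5",
                           "stabilityai/stable-diffusion-2-1"):
            pipe = StableDiffusionPipeline.from_pretrained(
                checkpoint, torch_dtype=torch.float16
            ).to("cuda")  # assumes a CUDA-capable GPU
            image = pipe(prompt).images[0]
            image.save(checkpoint.split("/")[-1] + ".png")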

  • It's similar to being in the closet. When you come out as a trans person, you "come out of your shell" so to speak. As such, people who haven't are considered to be "eggs" still inside their shells.

  • Assuming you have a strong base password you aren't worried about being cracked, you can use that, followed by a unique identifier for whatever you're logging into, so every password is essentially the same but also unique. For example, reduce the lyrics of a song (say, Without Me by Eminem) to first letters and punctuation, 2tpggrto,rto,rto, and add the identifier:

        2tpggrto,rto,rto-goog
        2tpggrto,rto,rto-faceb

    This is essentially how I manage my passwords that I want to actually remember. Just make sure you're not SUPER obvious with how you make the identifier, perhaps -g0og or -f4c3b0ok. And no, I don't use that song lol.
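    If it helps to see the scheme spelled out, here's a rough Python sketch of the idea. The function names and site tags are made up for illustration; the only trick is keeping first letters and punctuation for the base, then tacking on the per-site identifier.

        # Rough sketch of the scheme above: reduce a lyric to first letters
        # plus punctuation for the base, then append a per-site identifier.
        # Function names and site tags are made up for illustration.

        def base_from_lyrics(lyrics: str) -> str:
            """Keep the first letter of each word plus any punctuation marks."""
            parts = []
            for word in lyrics.split():
                parts.append(word[0].lower())
                parts.extend(ch for ch in word[1:] if not ch.isalnum())
            return "".join(parts)

        def site_password(base: str, site_tag: str) -> str:
            """Combine the memorized base with a per-site identifier."""
            return f"{base}-{site_tag}"

        lyric = ("Two trailer park girls go round the outside, "
                 "round the outside, round the outside")
        base = base_from_lyrics(lyric)          # -> "ttpggrto,rto,rto"
                                                # (the comment above swaps in "2" for "Two" by hand)
        print(site_password(base, "g0og"))      # ttpggrto,rto,rto-g0og
        print(site_password(base, "f4c3b0ok"))  # ttpggrto,rto,rto-f4c3b0ok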