Posts 6 · Comments 4,125 · Joined 2 yr. ago

    1. Waffle fries (always crispy and delicious)
    2. Curly fries (same)
    3. Tater tots (surprisingly good)
    4. Wedges, if properly spiced
    5. Good onion rings
    6. Shoestring, if not limp
    7. Bad onion rings
    8. Bad wedges
    9. Crinkle cut / zigzag (usually unflavored and soggy from a bag, but can be better than soggy shoestrings)
    10. Bad shoestrings
    11. Sweet potato fries (always soggy and I don't like sweet potato very much)
  • Your “probably not” argument gets thinner every major AI update.

    Right, but I'm talking about whether they're already using it, not whether they will in the future. It's certainly interesting to speculate about it though. I don't think we really know for sure how good it will get, and how fast.

    Something interesting that's come up is scaling laws. So far, compute, dataset size, and parameter count appear to set a floor on how low the error rate can go, largely regardless of the model's architecture. And dataset size and model size appear to need to be scaled up in tandem to avoid over- or under-fitting. It's possible, although not guaranteed, that we're discovering fundamental laws about pattern recognition. Or maybe it's just an issue with our current approach.
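
    To make the tandem-scaling point concrete, here's a minimal sketch of a Chinchilla-style parametric scaling law (the L(N, D) = E + A/N^α + B/D^β form from Hoffmann et al. 2022). The constants are approximately that paper's fitted values, and the function name is just illustrative; treat it as a back-of-the-envelope toy, not a definitive model.

    ```python
    # Rough sketch of a Chinchilla-style parametric scaling law,
    # L(N, D) = E + A / N^alpha + B / D^beta, with constants close to the
    # fitted values reported by Hoffmann et al. (2022). Illustrative only.

    E = 1.69                 # irreducible loss: the floor scaling can't remove
    A, ALPHA = 406.4, 0.34   # term that shrinks as parameter count N grows
    B, BETA = 410.7, 0.28    # term that shrinks as token count D grows

    def predicted_loss(n_params: float, n_tokens: float) -> float:
        """Predicted pretraining loss for n_params parameters trained on n_tokens tokens."""
        return E + A / n_params**ALPHA + B / n_tokens**BETA

    # Scaling only one axis runs into the other term's bottleneck:
    print(predicted_loss(70e9, 1.4e12))    # ~1.94  baseline (Chinchilla-ish sizes)
    print(predicted_loss(700e9, 1.4e12))   # ~1.89  10x params, same data
    print(predicted_loss(70e9, 14e12))     # ~1.86  10x data, same params
    print(predicted_loss(700e9, 14e12))    # ~1.81  both scaled 10x together
    ```

    With those assumed constants, 10x-ing parameters alone or data alone buys noticeably less than scaling both together, which is the same tandem-scaling / over- and under-fitting tradeoff described above.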