
  • Yeah, maybe somebody can translate for you. I considered using something else, but it was already long and I didn't feel like writing out multiple loops.

    No worries. It's neat how much such a comparatively simple concept can do, with enough data to work from. Circa 2010, I thought it would never work, lol.

  • Republican Spain and the "Autonomous Administration of North and East Syria" AKA Rojava.

    Republican Spain had some communist factions too, but Rojava is explicitly built around a specific strain of anarchism, and is an "administration" instead of a government. I doubt it looks very anarchist in practice, but that's neither here nor there, and they're democratic enough that the US has endorsed them in the past, to Turkey's great displeasure.

  • The thing people always overlook is that these legacy systems are only still running because they're super important. Nobody's hiring a junior COBOL dev to maintain NORAD, and hopefully nobody's contemplating putting ChatGPT in charge either.

    The move if you want this kind of job is to learn a language that's not quite a dinosaur yet, and have 20 years of experience in 20 years. Perl or PHP maybe.

  • At the simplest, it takes in a vector of floating-point numbers, multiplies it elementwise with other similar vectors (the "weights"), sums each result, applies a RELU to each sum, and then uses those values as the input vector for another layer with its own weights (or gives the output). The magic is in the weights.

    This operation is a simple matrix-by-vector product followed by an elementwise RELU, if you know what that means.

    In Haskell, something like:

    layer layerInput layerWeights = map relu $ map sum $ map (zipWith (*) layerInput) layerWeights

    foldl layer modelInput modelWeights

    Where modelWeights is [[[Float]]], and so layer has type [Float] -> [[Float]] -> [Float].

    RELU: if i>0 then i else 0. It could also be another nonlinear function, but RELU is obviously fast and works about as well as anything else. There's interesting theoretical work on certain really weird functions, though.
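
    To make that concrete, here's the whole thing as a tiny self-contained program. The sizes and weights are made-up numbers purely for illustration (3 inputs, 2 hidden values, 1 output):

    -- Same layer as above, with RELU written out.
    relu :: Float -> Float
    relu i = if i > 0 then i else 0

    layer :: [Float] -> [[Float]] -> [Float]
    layer layerInput layerWeights = map relu $ map sum $ map (zipWith (*) layerInput) layerWeights

    -- Two layers of made-up weights.
    modelWeights :: [[[Float]]]
    modelWeights =
      [ [ [0.5, 1.0, 0.25]   -- hidden value 1
        , [1.0, 0.0, -0.5] ] -- hidden value 2
      , [ [2.0, -1.0] ] ]    -- output value

    main :: IO ()
    main = print (foldl layer [1.0, 2.0, 3.0] modelWeights) -- prints [6.5]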


    Less simple, it might have a set pattern of zero weights which can be ignored, allowing a fast implementation with a bunch of smaller vectors, or have elementwise multiplication steps, like in the Transformer (see the sketch below). Aaand that's about it; all the rest is stuff that was figured out by trial and error, like encoding, and the math behind how to train the weights. Now you know.
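
    An elementwise multiplication step looks something like this; gate is a hypothetical name for a simplified stand-in, not the Transformer's actual formula:

    -- One vector scales another, element by element.
    gate :: [Float] -> [Float] -> [Float]
    gate = zipWith (*)

    -- gate [1.0, 2.0, 3.0] [0.0, 1.0, 0.5] == [0.0, 2.0, 1.5]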

    Assuming you use hex values for 32-bit weights, you could write a line with four of them, no problem:

    wgt35 = [0x1234FCAB, 0x1234FCAB, 0x1234FCAB, 0x1234FCAB];

    And you can sometimes get away with half-precision floats.
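
    In case the hex is confusing: those values are raw bit patterns, and GHC can reinterpret them as floats directly. A minimal sketch, where weightFromBits is just an illustrative name (castWord32ToFloat is the real function, from GHC.Float):

    import Data.Word (Word32)
    import GHC.Float (castWord32ToFloat)

    -- Reinterpret the raw bits of a 32-bit value as an IEEE-754
    -- single-precision float. 0x3F800000 is the bit pattern for 1.0.
    weightFromBits :: Word32 -> Float
    weightFromBits = castWord32ToFloat

    main :: IO ()
    main = print (weightFromBits 0x3F800000) -- prints 1.0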

  • Off the top of my head, 2. One with no UN seat and one long gone, to be fair, but they still exist and are/were sovereign. You can't say either turned into totalitarianism.

    Maybe you could say they would have or will, but that's just your guess. I could say the same thing about liberal democracy and be equally well supported.

  • Yeah, I kind of wish there was another word for the idea, because it's a bit confusing. I think originally it was "in reaction to progress".

    Big-C and small-c "conservative" are sometimes used to make the distinction in commentary, with big C meaning the reactionary stances common in right-wing parties that call themselves "Conservative", but I don't think everyone gets that either.

  • I really, really hope AfD gets banned. Germany has strong anti-hate protections, but enforcement on small groups is avoided because they're too small and not worth the trouble/attention, and enforcement on large groups is avoided because they're too large and powerful, so it's barely useful.

    Canada often does the same thing.