
  • yes it is, and it doesn't work.

    edit: to expand, if you're generating data it's an estimation. The network will learn the same biases and make the same mistakes and assumptions you did when generating the data. Also, outliers won't be in the set (because you didn't know about them, so the network never sees any). There's a toy sketch of this below.
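    A toy sketch of the point (every name and number in it is made up for illustration, nothing here is from a real project): the "generator" bakes in one simplified assumption, so a model fitted on that data can only reproduce the assumption, and never the outliers the generator never produced.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # "Generated" training data: we ASSUME temperature falls linearly with
    # altitude, and we generate no outliers because we don't know about them
    # (temperature inversions, sensor faults, ...).
    altitude = rng.uniform(0, 3000, size=(1000, 1))      # metres
    temperature = 15.0 - 0.0065 * altitude.ravel()       # our assumed rule

    model = LinearRegression().fit(altitude, temperature)

    # The fitted model reproduces our assumption almost exactly...
    print(model.predict([[1000.0]]))   # ~8.5, i.e. the rule we put in

    # ...but a real-world outlier (say an inversion layer reading 20 °C at
    # 1000 m) is something it will never predict, because nothing like it
    # ever appeared in the generated set.
    ```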

  • no need for that subjective stuff. The objective explanation is very simple: the output of the llm is sampled using a random process, a loaded die with probabilities according to the llm's output. It's as simple as that. There is literally a random element that is not part of the llm itself, yet is required for its output to be of any use whatsoever. (Small sketch of the sampling step below.)
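    A minimal sketch of that sampling step (the vocabulary and the probabilities are made up; in a real llm the probabilities come from a softmax over the model's logits):

    ```python
    import numpy as np

    # Made-up next-token distribution standing in for an llm's output.
    tokens = ["bike", "car", "cat", "dog"]
    probs = np.array([0.55, 0.25, 0.15, 0.05])

    rng = np.random.default_rng()

    # The "loaded die": the next token is drawn at random, weighted by the
    # model's probabilities, so repeated runs differ even though the model
    # computed exactly the same distribution every time.
    for _ in range(5):
        print(rng.choice(tokens, p=probs))
    ```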

  • again, whoosh. You missed the part where you train me before asking the question. Then I can extrapolate, and I need very few examples, as few as one.

    I'm talking from the perspective of having actually coded this stuff, not just speculating. A neural network can interpolate, but it sure as hell can't extrapolate anything that was not in its training data (rough demo of that below).

    Also, as a human, I can train myself.
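    A rough demo of the interpolation/extrapolation point, using scikit-learn as a stand-in network (my sketch, with my own choice of toy function): fit a small MLP on y = x² over [-1, 1], then query it well outside that interval.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Training data only covers x in [-1, 1].
    X_train = rng.uniform(-1.0, 1.0, size=(500, 1))
    y_train = (X_train ** 2).ravel()

    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    net.fit(X_train, y_train)

    # Inside the training range the network interpolates well...
    print(net.predict([[0.5]]))   # close to 0.25

    # ...outside it, the prediction has little to do with x**2
    # (a ReLU network just extends linearly past what it has seen).
    print(net.predict([[3.0]]))   # nowhere near 9.0
    ```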

  • ok, but that still entails trying random things until I find it. If I didn't already know it was a builtin, I wouldn't know to search there. The bash thing was just an example; I have learned this stuff since I encountered the problem. This is just me recollecting my experience of trying to use man.

  • I've met someone employed as a dev who not only didn't know that the compiler generates an executable file, but actually spent a month changing the code without noticing that none of their changes had any effect whatsoever (because they kept running an old build of mine).

  • You point to me and tell me this is a bike. If we go around it 90 degrees and you ask me what it is, I can still tell you it's a bike, even though I don't know what one does or is used for. Absolutely none of what you mentioned applies; I need no context. I only need to be able to tell that you pointed at the same object the second time, even though I'm viewing it from a slightly different angle.

    You point and say "this is a bike", we walk around it, you point again and ask me "what is that?" I reply "a bike... you've just told me!"

    Neural networks simply can't do that. They won't even recognize that it's the same object unless they were specifically trained to recognize it from all angles. You're talking about a completely different thing, which I never mentioned.

  • thanks for the advice. I knew about the search feature, but sometimes the stuff you need isn't even on the page. I have no idea how to find what I need when it's not in "man cmdname". How am I supposed to know that the feature I want has a dedicated page?

    How could I find certain commands if I didn't already know they were shell builtins and not standalone commands? It's not like you get a manpage saying "this is not a command". And even if I did have the idea to open the bash page, it's still useless, because the builtins have their own dedicated page. That sort of stuff. It rarely ever makes things easier for me.

    edit: it is occasionally useful when I have already found what I want on Google and just want some more in-depth details.

  • That's why you show him one before asking what that same car, viewed from a different angle, is.

    I had never seen a recumbent bike before. I only needed to see a single one to recognize any recumbent bike from then on, even one with a different color or make and model. The human brain definitely works differently.