
Jojo, Lady of the West @Silentiea@lemmy.blahaj.zone
Posts 1 · Comments 745 · Joined 1 yr. ago

  • Okay, so I don't follow politics all too closely, and I don't live in or near Pennsylvania, and the person that came to mind when I read "Shapiro" was "Ben" since I'd never heard of "Josh".

    This confused me. Now I know more.

  • It is. The actual test for humans there isn't that you clicked the right squares, it's how your mouse jitters or how your finger moves a bit when you tap (see the sketch below).
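
    Purely as an illustration of that idea (no real CAPTCHA publishes its checks, so the function and features here are made up), a toy Python sketch of the kind of "too clean to be human" signal a behavioural test could look at:

    ```python
    # A toy, purely hypothetical heuristic, not how reCAPTCHA or any real
    # service actually works. It only illustrates the kind of trace-level
    # signal the comment describes: humans wobble and vary their timing,
    # while scripts tend to move in perfectly straight, evenly timed steps.
    import math

    def trace_features(samples):
        """samples: list of (x, y, t) pointer positions, t in seconds.
        Returns (wobble, timing_jitter): wobble is path length divided by
        straight-line distance (1.0 means a perfectly straight path), and
        timing_jitter is the variance of the gaps between samples."""
        path_len = sum(
            math.dist(samples[i][:2], samples[i + 1][:2])
            for i in range(len(samples) - 1)
        )
        direct = math.dist(samples[0][:2], samples[-1][:2]) or 1e-9
        gaps = [samples[i + 1][2] - samples[i][2] for i in range(len(samples) - 1)]
        mean_gap = sum(gaps) / len(gaps)
        timing_jitter = sum((g - mean_gap) ** 2 for g in gaps) / len(gaps)
        return path_len / direct, timing_jitter

    # A scripted pointer gliding in a straight line at a fixed rate scores
    # wobble close to 1.0 and timing_jitter close to 0: the "too clean" trace.
    robotic = [(i * 10.0, i * 10.0, i * 0.016) for i in range(20)]
    print(trace_features(robotic))
    ```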

  • Lol I love how this article censors the picture of the guy smoking because "think of the children" and "we must preserve the magic" or whatever, but leaves Minnie's severed head just sitting on the ground.

  • I mean, Chemical Wonka and Rhynoplaz, it says it right there

  • Rule

  • I just love how the last part parses.

    They're rapists. And some (of those rapists?), I assume, are good people.

    It's ridiculous, but also he probably does think some rapists are good people.

  • Rule

  • That circle is coincident with Shakespeare.

  • I mean, you can. OceanGate was a thing; I'm sure there are others.

  • I mean, yes?

    That's very pithy, but the material used as training data was probably a mix: work by artists attempting to create art using tools (ai and otherwise), more mundane data designed and produced by humans with no ai tools at all, and some produced by humans using almost exclusively ai tools.

  • You can't really have it both ways.

    Is the thing just a machine that's following instructions and synthesizing its training data into different things? Then it's a tool.

    Is the thing making choices and interpreting your inputs to produce a result? Then it's an artist.

    The painter I buy a commission from is an artist. The ai I use to generate a scene is a tool.

  • As I understand it, the biggest struggle with such panels is that they prevent heavy machinery from working the crop.

  • "You reduce some of the complexity and unpredictability by introducing an explanation for these changes of world state"

    My concern is that "consciousness" isn't so much an explanation as it is a sort of heuristic. We feel conscious and have an internal experience, so it seems pretty reasonable to say that such a thing exists, but other than one's own self, there's no point where it is certain to exist, and no clear criteria or mechanism that we can point to.

    What about the p-zombie, the human person who just doesn't have an internal experience and just has a set of rules, but acts like every other human? What about a cat, who apparently has a less complex internal experience, but seems to act like we'd expect if it has something like that? What about a tick, or a louse? What about a water bear? A tree? A paramecium? A bacterium? A computer program?

    There's a continuum one could construct that includes all those things and ranks them by how similar their behaviors are to ours, and calls the things close to us conscious and the things farther away not, but the line is always going to be fuzzy. There's no categorical difference that separates one end of the spectrum from the other; it's just about picking where to put the line.

    And yes, we perhaps have a better understanding of the mechanism behind how an ai gets from input to output than we do for a human, but it's not a complete one, and our understanding of how humans get from an input to an output is similarly partial. We can see how the arrangement and function of nerve cells in a "brain" lead to the behaviors we see, and we have even fully simulated the brains of some organisms in software. That's not so dissimilar from how a computational neural network operates. The categorical difference of "well, one is a computer" doesn't work when we have literally simulated an organic brain on a computer too.

  • "assuming they do have a conscience simplifies your world model."

    Does it? Feels more like it merely excludes them from your model, since your model cannot explain their conscience. If that simplifies your model, then you can apply the same thinking to anything you don't understand by simply saying it is similar to something else you also can't explain.

    "The other important bit is that not assuming some phenomenon exists (e.g., 'AI' can experience emotions) unless proven otherwise"

    The problem with this isn't that it's literally unprovable; it's that proving it requires defining "can experience emotions" in a way everyone can agree on. Most trivial definitions that include everything we think obviously ought to be included also bring in many things we think ought to be excluded, and many complicated definitions that prune out the things we think ought to be excluded also cut out things we think should be included.

  • What about a cat, or a person who's different from you? It's just as impossible to prove, and yet...

  • Probability rule

  • Under the assumption that at least one of those answers is "correct" and fixed (not changing with respect to guesses) and that all options are listed, it is either 25 or 50 percent: 25 if the "correct" answer is 50 or 60 (or exactly one specific 25), and 50 if the correct answer is 25 (either of them). See the sketch below.

    Clearly this requires a departure from the expected definitions of the words and/or numbers in the question. It's terrible question writing, I'll say.
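
    A minimal sketch of that case analysis, assuming the classic version of the puzzle: four options reading 25%, 50%, 60%, 25%, one of which is designated "correct" in advance, and a guess drawn uniformly at random. The two functions correspond to the two readings of "correct" distinguished above:

    ```python
    from fractions import Fraction

    # The classic option list: two 25s, one 50, one 60.
    options = {"A": 25, "B": 50, "C": 60, "D": 25}
    n = len(options)

    def chance_by_value(correct_label):
        # "Correct" means matching the value, so picking either 25 counts
        # when a 25 is the designated answer.
        target = options[correct_label]
        hits = sum(1 for v in options.values() if v == target)
        return Fraction(hits, n)

    def chance_by_label(correct_label):
        # "Correct" means hitting exactly that one option (exactly one
        # specific 25), so every case is 1 in 4.
        return Fraction(1, n)

    for label, value in options.items():
        print(label, value, chance_by_value(label), chance_by_label(label))

    # By value: 25 -> 1/2, 50 -> 1/4, 60 -> 1/4, so the answer is 50 or 25.
    # By label: always 1/4, so the answer is 25.
    ```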

  • That's unprovable without some very strict definitions, but if we take it as a given (and for the record I don't disagree, so we should) then that's why the ai isn't the artist. It's just a tool an artist could use. MS Paint isn't an artist either, and like ai neither are many of the people using it, but it still can be used to create art.

  • Whatever the artist using the AI tool is trying to express?

  • I think most would consider most tree roots to be "woody"