Posts 4 · Comments 2,060 · Joined 2 yr. ago

  • I mean, no, not really. We know what thinking is. It's neurons firing in your brain in varying patterns.

    What we don't know is the exact wiring of those neurons in our brain. So that's the current challenge.

    But previously, we couldn't even effectively simulate neurons firing in a brain. Modern AI algorithms are called neural networks because they can effectively simulate the way neurons fire (just using silicon), and that makes them really good at all the fuzzy pattern-matching problems that computers used to be really bad at (a minimal sketch of one such simulated neuron follows this comment).

    So now the challenge is figuring out the wiring of our brains, and/or figuring out a way of creating intelligence that doesn't use the wiring of our brains. Both are entirely possible now that we can experiment with, build, and combine simulated neurons at roughly the same scale as the human brain.
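    Here is a minimal sketch of the kind of unit being simulated, assuming made-up weights and a plain sigmoid activation (real networks learn the weights from data and stack billions of these):

    ```java
    // One artificial "neuron": a weighted sum of inputs squashed through an
    // activation function. Weights and inputs here are invented for illustration.
    public class Neuron {
        static double activate(double[] inputs, double[] weights, double bias) {
            double sum = bias;
            for (int i = 0; i < inputs.length; i++) {
                sum += inputs[i] * weights[i]; // each input scaled by a synapse-like weight
            }
            return 1.0 / (1.0 + Math.exp(-sum)); // sigmoid: a fuzzy 0-to-1 "firing" strength
        }

        public static void main(String[] args) {
            double[] inputs = {0.9, 0.1, 0.4};
            double[] weights = {2.0, -1.5, 0.5};
            System.out.println(activate(inputs, weights, -0.5)); // ~0.79: a fuzzy "yes"
        }
    }
    ```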

  • When I heard that line I was like "Yeah, sure. We'll never have AI in my lifespan" and you know what? I was right.

    Unless you just died or are about to, you can't really confidently make that statement.

    There's no technical reason to think we won't in the next ~20-50 years. We may not, and there may turn out to be a technical reason why we can't. But the previous big technical hurdles were the amount of compute needed and the fact that computers couldn't handle fuzzy pattern matching. Modern AI has effectively solved the pattern-matching problem, and current large models like ChatGPT already have more parameters than there are neurons in the human brain, let alone the compute that will be available to them in 30 years.

  • Lmao, the fuckepic and 'Tim Sweeney is a bastard man' sentiments were always wildly overblown.

    The fact that anyone took Apple's side in this case (because the Epic Games Store paid for a couple of exclusive games to try to break into the market) is laughably childish.

    Apple literally rips off the entire world to the tune of billions of dollars a year through app store mafia extortion fees alone, let alone the rest of their anti-competitive bullshit.

    Epic was just trying to break into the Apple / Google / Steam monopolies and made a couple of unpopular, anti-consumer business moves on a few games (all the while taking a far smaller cut of profits than any other store). Meanwhile, Apple has based its entire multi-decade business model on anti-consumer choices, and has done that for every single hardware and software product they sell you.

    They are not remotely comparable. Epic was always fully in the right in their anti-monopoly legal battles.

    This is literally Lemmy, a (relatively) niche platform where somebody is asking about a (relatively) niche subject. I don't think anything about this is "average person" territory.

    'Average person' was in quotes because it's the language you used to describe someone not comfortable with the command line.

  • I mean what's the point of "self" hosting then?

    If you have to be a professional server administrator to host one of these services, then why even have a self-hosting community, as opposed to just a hosting community for server admins to discuss how to set up and configure various services? Is this community dedicated to just discussing the uniqueness of managing a home server without a static IP?

    Self hosting is just an extension of open source software. Its only goal is letting you run your own backends for apps so you aren't exploited by major companies. Its goal is not to be a niche technical hobby; if that's your goal in its own right, then get a model train or a Warhammer set.

    Mainstream consumers don't know the words "Plex" and "Home Assistant" either.

    Yes, they do lol. It's flat out weird to think that the only people who have ever heard of pirating are software developers and server admins who use the command line.

    You're viewing this through an incredibly skewed lens. The average person will never even consider self hosting, nor will they care; if anything, the average person prefers cloud services.

    The only lens I'm viewing this through is one that dares to imagine that the Venn diagram of "computer users savvy enough to care about privacy" isn't 100% contained within the circle of "computer users savvy with the terminal".

    Quite frankly, your stance that the "average person" doesn't care, when this post is LITERALLY from an "average person" who does, is the one that seems off base on its face.

    Sort of. When you hang out with people who are less informed, it's going to get awkward if you're constantly explaining stuff. Eventually everyone will pick up on the vibe that you know way more than everyone else.

    But I don't hate it. That stuff can be overcome by working on yourself and your social ability. Often the wittiest people can slyly point out the folly of something someone said with a quick wry joke, and more importantly, you can learn to just let stuff go, not always explain everything or make sure everyone is exactly right about everything, and instead focus on being a conversationalist who keeps things flowing in a fun way.

    Plus, then when you do hang out with other people who are well informed, you can have interesting deep conversations. And the world is a lot less scary and hard to navigate if you understand how it all works.

  • Cities still need a way of knowing when streetlights burn out or are in need of service.

    You can wait for people to report them, hope the report is accurate, and then send workers out to try to find and fix them (and it's not exactly easy to figure out which light is burnt out during the day), or you can proactively send workers to exactly the right light as soon as it breaks or shows any sign that it might (a hypothetical self-reporting sketch follows this comment).
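    A toy sketch of the proactive approach, assuming an invented Lamp record, pole IDs, and current-draw threshold (none of these come from the comment above; they just illustrate a light reporting its own fault):

    ```java
    // Hypothetical self-reporting streetlight: near-zero current draw while
    // powered means the bulb or driver has failed, so the light itself can
    // tell the city exactly which pole to service. All names/values invented.
    import java.time.Instant;

    public class StreetlightMonitor {
        record Lamp(String poleId, double lat, double lon, double currentDrawAmps) {}

        static void report(Lamp lamp, String status) {
            System.out.printf("%s pole=%s (%.5f, %.5f): %s%n",
                    Instant.now(), lamp.poleId(), lamp.lat(), lamp.lon(), status);
        }

        public static void main(String[] args) {
            Lamp lamp = new Lamp("P-1042", 47.60621, -122.33207, 0.02);
            if (lamp.currentDrawAmps() < 0.1) { // a healthy fixture draws far more when lit
                report(lamp, "FAULT: no current draw while powered, dispatch crew");
            } else {
                report(lamp, "OK");
            }
        }
    }
    ```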

    I'm not talking ecosystem or which I'd choose to build an actual project with. Just on a pure language basis, C#'s type system is more flexible and less verbose than Java's, and unlike Java, C# actually treats functional programming as first class.

    Java has certainly gotten better in both regards (see the sketch after this comment), but C# was really just a joy in comparison.
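    For a sense of what "gotten better" looks like on the Java side, here's a small sketch of its newer functional style; the C# point above is that delegates and LINQ make the same thing terser and more natural:

    ```java
    // Java's retrofitted functional features: lambdas, method references,
    // and streams. Functions are values, but they're routed through interface
    // types like Function<T, R> rather than a first-class function type.
    import java.util.List;
    import java.util.function.Function;

    public class FunctionalJava {
        public static void main(String[] args) {
            List<String> names = List.of("ada", "grace", "linus");

            Function<String, String> shout = s -> s.toUpperCase() + "!";

            names.stream()
                 .filter(n -> n.length() > 3)   // keep the longer names
                 .map(shout)                    // apply the function value
                 .forEach(System.out::println); // GRACE! LINUS!
        }
    }
    ```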

  • If you can't imagine why this is bad, maybe read some Kafka or watch some Black Mirror.

    Lmfao. Yeah, ok, let's get my predictions from the depressing show dedicated to being relentlessly pessimistic at every single decision point.

    And yeah, like I said, you sound like my hysterical middle school teacher claiming that Wikipedia would be society's downfall.

    Guess what? It wasn't. People learned that the tool was error-prone and came up with strategies to use it while correcting for potential errors.

    Like, at a fundamental, technical level, components of a system can be error-prone but still be useful overall. Quantum calculations have inherent probabilities and errors in them, but they can still solve some types of problems so much faster than normal computers that you can run the same calculation 100x on a quantum computer, average out the results to remove the outlying errors, and still get the right answer far faster than a classical computer could (a toy majority-vote sketch follows this comment).

    Computer chips in satellites and the space station constantly have random bits of memory flipped by cosmic rays, but they still work fine because their RAM is error-correcting RAM, which uses similar methods to detect and fix those errors.

    And at a super high level, some of my friends and coworkers are more reliable than others; that doesn't mean the less reliable ones aren't helpful, it just means I have to take what they say with a grain of salt.

    Designing for error correction is a thing, and people are perfectly capable of doing so in their personal lives.
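    A toy sketch of that repeat-and-vote idea, assuming an invented noisyComputation with a made-up 20% error rate (real quantum error mitigation and ECC RAM are far more sophisticated; this just shows why scattered errors can't outvote a consistent truth):

    ```java
    // An unreliable component can still yield a reliable answer: run it many
    // times and take a majority vote. Errors scatter; the truth repeats.
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Random;

    public class MajorityVote {
        static final Random RNG = new Random(42);

        // Invented noisy "component": right 80% of the time, garbage otherwise --
        // loosely like a noisy qubit readout or a cosmic-ray-flipped bit.
        static int noisyComputation(int trueAnswer) {
            return RNG.nextDouble() < 0.8 ? trueAnswer : RNG.nextInt(100);
        }

        public static void main(String[] args) {
            Map<Integer, Integer> votes = new HashMap<>();
            for (int i = 0; i < 101; i++) {
                votes.merge(noisyComputation(7), 1, Integer::sum);
            }
            // The wrong answers rarely agree with each other, so the true
            // answer dominates the tally.
            int answer = votes.entrySet().stream()
                    .max(Map.Entry.comparingByValue())
                    .orElseThrow()
                    .getKey();
            System.out.println("majority answer = " + answer); // 7
        }
    }
    ```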