
  • Future news story: Trump wanted to use nuclear bomb on Iran but couldn't figure out how to read the codes in the nuclear football.

    "No matter how many times I asked no one brought me a football. They just kept handing me a briefcase!"

  • Something tells me that the MAGA base won't be too happy about this. They want the deportations to happen in their communities.

    "Sorry Texas: Because you're full of MAGA people we're cutting your ICE resources."

  • To me, this is like saying, "4chan has turned into a cesspool!" Yeah: It was like that from the start. YOU were the ones who assumed it was ever safe!

    You're posting stuff on the public Internet to a website for adults where literally anyone can sign up and comment FFS.

    If you want good moderation you need community moderation from people in that community. Not some giant/evil megacorp!

    There are all sorts of tools and platforms that do this properly, easily, and for free. If you don't like Meta's websites, move off them already!

  • Apparently billionaires do give a shit. A lot of it. An entire island of shit... trickling down to the rich people on the next layer of hell down.

  • Mods on Xbox only exist for games where the developer officially added mod support. I mean, sure, it's great when a game maker does that, but usually it's not as good as community-made mod support, where mods don't require approval and can't be censored or removed just because the vendor doesn't like them.

    Remember: Microsoft's vision of mods is what you get with the Bedrock version of Minecraft. Yet the mods available in the Java version are so vastly superior the difference is like night and day.

    Console players, who are used to living without mods, don't understand. Once mods become a regular thing you expect in popular games, going without them feels like going back to the dark ages.

  • All the fun of Windows gaming with the locked-down ecosystem of a console (no mods). What could go wrong?

    It's Windows Mobile all over again.

  • Interestingly, that's how it works for construction jobs too!

    Things will break and they will be back.

  • The courts need to settle this: Do we treat AI models like a Xerox copier or an artist?

    If it's a copier then it's the user that's responsible when it generates copyright-infringing content. Because they specifically requested it (via the prompt).

    If it's an artist then we can hold the company accountable for copyright infringement. However, that would result in a whole shitton of downstream consequences that I don't think Hollywood would be too happy about.

    Imagine a machine that can make anything... Like the TARDIS or Star Trek replicators. If someone walks up to the machine and says, "make me an Iron Man doll" would the machine be responsible for that copyright violation? How would it even know if it was violating someone's copyright? You'd need a database of all copyrighted works that exist in order to perform such checks. It's impossible.

    Even if you want OpenAI, Google, and other AI companies to pay for copyrighted works, there needs to be some mechanism for them to check whether something is copyrighted. To do that you'd need to keep a copy of everything that exists (since everything is copyrighted by default). There's a toy sketch of this problem at the end of this comment.

    Even if you train an AI model with 100% ethical sources and paid-for content, it's still very easy to force the model to output something that violates someone's copyright. The end user can do it. It's not even very difficult!

    We already had all these arguments in the 90s and early 2000s, back when every sane person was fighting the music industry and Hollywood. They were trying to shut down literally all file sharing (even personal file shares) and search engines with the same argument. If they had succeeded it would've broken the entire Internet and we'd be back to using things like AOL.

    Let's not go back there just because you don't like AI.
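
    Back to the "how would it even know" problem: here's a toy Python sketch of why an exact-match registry can't work. Everything here is made up for illustration, and a real system would also need fuzzy matching, which makes the problem even harder:

    ```python
    import hashlib

    # Exact matching only works if the registry literally contains every work
    # ever created (since everything is copyrighted by default), and it still
    # breaks on the most trivial variation.

    def fingerprint(work: bytes) -> str:
        return hashlib.sha256(work).hexdigest()

    # Pretend registry. A real one would need billions of entries.
    REGISTRY = {fingerprint(b"Iron Man (2008), Marvel Studios")}

    def looks_copyrighted(work: bytes) -> bool:
        return fingerprint(work) in REGISTRY

    print(looks_copyrighted(b"Iron Man (2008), Marvel Studios"))   # True
    print(looks_copyrighted(b"Iron Man (2008), Marvel  Studios"))  # False (two spaces)
    ```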

  • "support policies that promote childbearing."

    ...like making abortion legal in all 50 states again? Like comprehensive sex education in schools? Like treating women as people and not property?

    The most effective way for the Southern Baptists to promote childbearing would be for them to stay the fuck out of politics!

  • Who knows, maybe ~~we might even let them come back to US soil~~ they might leave CECOT alive... some day.

    FTFY.

    When you kidnap people without due process on the regular you're encouraging people to fight back. When you send them to foreign gulags known for literally torturing and killing people and forcing them into slave labor you're encouraging them to fight back with deadly force.

    Dying—fighting for your life—sure sounds better than just giving up and letting ICE take you away. At this point the Trump Administration and ICE cannot be trusted to execute due process. They're operating outside of the law. They are the lawless ones.

    They're hoping for deadly conflict and I fear they're going to get it. Though, on the plus side I'm 100% certain they will be unhappy with the outcome. In both the short and long term.

  • This is why combining religion and government is always a bad idea.

  • To be fair, the world of JavaScript is such a clusterfuck... Can you really blame the LLM for needing constant reminders about the specifics of your project?

    When a programming language has five hundred bazillion absolutely terrible ways of accomplishing a given thing (and endless absolutely awful code examples on the Internet to "learn from") you're just asking for trouble. Not just from trying to get an LLM to produce what you want, but also from trying to get humans to do it.

    This is why LLMs are so fucking good at writing Rust and Python: There's only so many ways to do a thing, and the larger community pretty much always uses the same solutions.

    JavaScript? How can an LLM even keep up? You're using yarn today, but in a year you'll probably be like, "fuuuuck, this code is garbage... I need to convert this all to [new thing]."

  • Define, "reasoning". For decades software developers have been writing code with conditionals. That's "reasoning."

    LLMs are "reasoning"... They're just not doing human-like reasoning.
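
    A trivial example of that decades-old kind of "reasoning" (made-up Python scenario, obviously):

    ```python
    # Branch on conditions, reach a conclusion. Primitive, but it's "reasoning".
    def triage(temp_c: float, has_rash: bool) -> str:
        if temp_c >= 39.0 and has_rash:
            return "see a doctor today"
        elif temp_c >= 38.0:
            return "rest and re-check tomorrow"
        return "you're probably fine"

    print(triage(39.5, True))  # see a doctor today
    ```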

  • I'm not convinced that humans don't reason in a similar fashion. When I'm asked to produce pointless bullshit at work my brain puts in a similar level of reasoning to an LLM.

    Think about "normal" programming: An experienced developer (that's self-trained on dozens of enterprise code bases) doesn't have to think much at all about 90% of what they're coding. It's all bog standard bullshit so they end up copying and pasting from previous work, Stack Overflow, etc because it's nothing special.

    The remaining 10% is "the hard stuff". They have to read documentation, search the Internet, and then, after all that effort to avoid having to think, they sigh and actually start thinking in order to program the thing they need.

    LLMs go through similar motions behind the scenes! Probably because they were created by software developers. But they still fail at that last 10%: the stuff that requires actual thinking.

    Eventually someone is going to figure out how to auto-generate LoRAs from test cases plus trial and error, which the model then uses to improve itself, and that's when people are going to be like, "Oh shit! Maybe AGI really is imminent!" But again, they'll be wrong.

    AGI won't happen until AI models get good at retraining themselves with something better than basic reinforcement learning. In order for that to happen you need the working memory of the model to be nearly as big as the hardware that was used to train it. That, and loads and loads of spare matrix math processors ready to go for handling that retraining.
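
    Pure speculation on my part, but here's a toy Python sketch of what that trial-and-error loop could look like. The "model" is just a line and the "adapter" is two numbers standing in for a LoRA's low-rank weight deltas; every name here is hypothetical:

    ```python
    import random

    # Toy version of "auto-generate adapters from test cases + trial and error".
    # Target behavior is defined purely by the test cases.

    def model(x: float, adapter: tuple[float, float]) -> float:
        da, db = adapter  # stand-ins for LoRA weight deltas
        return (1.0 + da) * x + db

    TEST_CASES = [(x, 3 * x + 2) for x in range(-5, 6)]  # want y = 3x + 2

    def score(adapter: tuple[float, float]) -> float:
        return -sum((model(x, adapter) - y) ** 2 for x, y in TEST_CASES)

    best = (0.0, 0.0)
    for _ in range(5000):  # trial and error: mutate, keep whatever scores better
        cand = (best[0] + random.gauss(0, 0.1), best[1] + random.gauss(0, 0.1))
        if score(cand) > score(best):
            best = cand

    print(best)  # lands near (2.0, 2.0), i.e. it learned y = 3x + 2
    ```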

  • The only reason we're not there yet is memory limitations.

    Eventually some company will come out with AI hardware that lets you link up a petabyte of ultra-fast memory to chips that contain a million parallel matrix math processors. Then we'll have an entirely new problem: AI that trains itself incorrectly too quickly.

    Just you watch: The next big breakthrough in AI tech will come around 2032-2035 (when the hardware is available) and everyone will be bitching that "chain reasoning" (or whatever the term turns out to be) isn't as smart as everyone thinks it is.

  • Pressing down too hard breaks the pushbutton functionality. It has nothing to do with stick drift.

    But since we're talking about what causes things... You know what actually causes potentiometer-based sticks to fail fast? Sweat. That's right!

    The NaCl in your sweat (even the tiniest microscopic amounts) is enough to degrade the coating and the brushes on potentiometers. The more your hands sweat, the faster your sticks will degrade.

    Got sweaty palms? Best to use hall effect sticks or save up to buy new ones on the regular! 😁

    Also: If you allow your controllers to get really cold and then regularly (and rapidly) warm them up with your hands while playing, that can have a negative impact too.

  • At scale a hall effect stick is about $0.25 more than a potentiometer version. That's about $38,000,000 if they sell as many Switch 2s as they sold Switches.

    Sooooo... Nothing. That's basically a rounding error to Nintendo. Remember: That figure is over eight years.

    If it means they won't have lawsuits (which cost millions on their own), fewer returns, and happier customers, it most certainly would be worth the extra ~$5 million/year (quick math at the end of this comment).

    The part you're missing isn't the cost. It's the potential sales from replacement joycons. If you're going to make a devil's-advocate capitalist argument, that's the one to make.

    I don't think it's any of that, though. I think it's just management being too strict about design constraints (which I pointed out in an earlier comment).
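
    For anyone checking my math, here it is as a few lines of Python. The ~152 million number is my assumption for lifetime Switch sales (it's in the right ballpark), and I'm counting the $0.25 upcharge once per unit like the figure above does:

    ```python
    UNITS = 152_000_000   # lifetime Switch sales (assumed)
    UPCHARGE = 0.25       # extra cost per hall effect stick at scale, USD
    YEARS = 8             # the Switch's sales window so far

    total = UNITS * UPCHARGE
    print(f"total:    ${total:,.0f}")          # total:    $38,000,000
    print(f"per year: ${total / YEARS:,.0f}")  # per year: $4,750,000
    ```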

  • I design things that use hall effect sensors... The magnets in the joycons would not have interfered. Those magnets are:

    1. Too far away from the sticks to matter.
    2. Perpendicular/orthogonal to the magnets that would be in the sticks.

    Besides, you can cram hall effect stuff super tight just by inserting a tiny piece of magnetic shielding between components. Loads of products do this (mostly to prevent outside magnets from interfering but it's the same concept). What is this magic magnetic shielding technology? EMI tape.

    There's a zillion types and they're all cheap and very widely used in manufacturing. I guarantee your phone, laptop, and many other electronics you own have some sort of EMI tape inside of them.

    Just about every assembly line that exists for mass produced electronics has at least one machine that spits out tape a bit like a CNC machine (or they pay the cheapest worker possible to place it).

  • Note: Hall effect sticks aren't that much more expensive than potentiometer sticks (difference is less than a dollar at scale). However, they require more space than potentiometer sticks and if you're doing something custom (which Nintendo always does) it can be a great big expense to change your manufacturing processes to insert tiny magnets into injection molded parts.

    I suspect the latter is the reason why they abandoned using hall effect or TMR sticks for the Switch 2.

    My wild speculation: Nintendo probably gave their engineers some design constraints that limited their ability to use off-the-shelf HE parts (everything I've seen really is too big). Rather than change the constraints slightly in order to make the product usable with such parts, they stayed stubborn in the hopes that their engineers would come up with an innovative solution. This sort of thing can work to force innovation at really big companies, if they're not super top-down in terms of decision making.

    I'm sure that the Nintendo engineers came up with their own perfectly-workable HE/TMR stick designs but had them shot down in meetings where they discussed the manufacturing costs.