
  • stolen copied creations

    When something is stolen, the one who originally held it no longer has it. In other words, stealing covers physical things.

    Copying is what you're talking about and this isn't some pointless pedantic distinction. It's an actual, real distinction that matters from both a legal/policy standpoint and an ethical one.

    Stop calling copying stealing! This battle was won by everyday people (Internet geeks) against Hollywood and the music industry in the early 2000s. Don't take it away from us. Let's not go back to the "you wouldn't download a car" world.

  • I dunno. It's better than their old, non-AI slop 🤷

    Before, I didn't really understand what they were trying to communicate. Now—thanks to AI—I know they weren't really trying to communicate anything at all. They were just checking off a box 👍

  • My argument is that the LLM is just a tool. It's up to the person that used that tool to check for copyright infringement. Not the maker of the tool.

    Big-company LLMs were trained on hundreds of millions of books. They're using an algorithm that's built on that training. To say that their output is somehow a derivative of hundreds of millions of works is true! But then how do you decide how much you have to pay each author for that output? They don't have to pay for the input; only the distribution matters.

    My argument is that it's far too diluted to matter. Far too many books were used to train it.

    If you train an AI on Stephen King's works and nothing else, then yeah: maybe you have a copyright argument to make when you distribute the output of that LLM. But even then, probably not, because the output isn't going to be identical. It'll just be similar. You can't copyright a style.

    Having said that, with the right prompt it would be easy to use that Stephen King LLM to violate his copyright. The point I'm making is that until someone actually does use such a prompt no copyright violation has occurred. Even then, until it is distributed publicly it really isn't anything of consequence.

    Challenges with the firmware: Mostly just learning embedded Rust. It's a bit different from regular Rust because you don't have access to std (which means no Vec!).
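    For anyone curious what that looks like in practice, here's a minimal sketch of the kind of fixed-capacity, allocation-free buffer you end up writing (or pulling in from a crate like heapless) when Vec isn't available. The type and names here are hypothetical, not from my actual firmware:

    ```rust
    // A fixed-capacity buffer: the no_std substitute for Vec. All storage
    // lives on the stack (or in a static), so nothing ever allocates.
    pub struct FixedBuf<const N: usize> {
        data: [u8; N],
        len: usize,
    }

    impl<const N: usize> FixedBuf<N> {
        pub const fn new() -> Self {
            Self { data: [0; N], len: 0 }
        }

        /// Push a byte. Unlike Vec::push, this can fail: when the buffer
        /// is full it hands the byte back instead of allocating.
        pub fn push(&mut self, b: u8) -> Result<(), u8> {
            if self.len == N {
                return Err(b);
            }
            self.data[self.len] = b;
            self.len += 1;
            Ok(())
        }

        pub fn as_slice(&self) -> &[u8] {
            &self.data[..self.len]
        }
    }
    ```

    The key mental shift from Vec is that push can fail instead of growing, so the firmware has to decide up front what to do when the buffer is full.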

    I remember having the hardest time just organizing the code. As in, I wanted to divide everything up into logical segments like "LEDs", "multiplexers", "infrared", etc. and Rust makes it kinda hard to do that without making everything its own little crate. Specifically, if you want your code to be device-agnostic. If you only care about your one board then it's easy and doesn't matter so much 🤷
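    One pattern that helps with the device-agnostic part (sketched here with hypothetical names, not my actual code) is putting each logical segment behind a trait, so the application logic compiles against the trait and each board supplies its own impl:

    ```rust
    // One trait per hardware concern; the real board impl would poke
    // GPIO/multiplexer registers, while a mock runs on the host.
    pub trait LedDriver {
        fn set(&mut self, idx: usize, on: bool);
    }

    pub struct MockBoard {
        pub leds: [bool; 8],
    }

    impl LedDriver for MockBoard {
        fn set(&mut self, idx: usize, on: bool) {
            self.leds[idx] = on;
        }
    }

    // Application logic stays generic over whatever board it runs on.
    pub fn light_first<D: LedDriver>(dev: &mut D, count: usize) {
        for i in 0..count {
            dev.set(i, true);
        }
    }
    ```

    The nice side effect is that the generic logic can be unit-tested against the mock on a desktop machine, with no hardware attached.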

    I got the boards made at JLCPCB because they're the cheapest and seemed good enough 🤷

  • I used kicad and wrote the firmware in Rust from scratch 👍

  • Nah just own it. You only need to kill this one person to obtain your shapeshifter dreams!

    Admit it, "I'd say sorry but I'd still pull the trigger." 😁

  • If we're going pie in the sky I would want to see any models built on work they didn't obtain permission for to be shut down.

    I'm going to ask the tough question: Why?

    Search engines work because they can download and store everyone's copyrighted works without permission. If you take away that ability, we'd all lose the ability to search the Internet.

    Copyright law lets you download whatever TF you want. It isn't until you distribute said copyrighted material that you violate copyright law.

    Before generative AI, Google screwed around internally with all those copyrighted works in dozens of different ways. They never asked permission from any of those copyright holders.

    Why is that OK but doing the same with generative AI is not? I mean, really think about it! I'm not being ridiculous here, this is a serious distinction.

    If OpenAI did all the same downloading of copyrighted content as Google and screwed around with it internally to train AI, then never released a service to the public, would that be different?

    If I'm an artist who makes paintings and someone pays me to copy someone else's copyrighted work, that's on me to make sure I don't do it. It's not really the problem of the person who hired me unless they distribute the work.

    However, if I use a copier to copy a book then start selling or giving away those copies that's my problem: I would've violated copyright law. However, is it Xerox's problem? Did they do anything wrong by making a device that can copy books?

    If you believe that it's not Xerox's problem then you're on the side of the AI companies. Because those companies that make LLMs available to the public aren't actually distributing copyrighted works. They are, however, providing a tool that can do that (sort of). Just like a copier.

    If you paid someone to study a million books and write a novel in the style of some other author you have not violated any law. The same is true if you hire an artist to copy another artist's style. So why is it illegal if an AI does it? Why is it wrong?

    My argument is that there's absolutely nothing illegal about it. They're clearly not distributing copyrighted works. Not intentionally, anyway. That's on the user. If someone constructs a prompt with the intention of copying something as closely as possible... To me, that is no different than walking up to a copier with a book. You're using a general-purpose tool specifically to do something that's potentially illegal.

    So the real question is this: Do we treat generative AI like a copier or do we treat it like an artist?

    If you're just angry that AI is taking people's jobs say that! Don't beat around the bush with nonsense arguments about using works without permission... Because that's how search engines (and many other things) work. When it comes to using copyrighted works, not everything requires consent.

  • Well, I was on my phone at the time and that's what ChatGPT generated. Here's what FLUX Dev generated (locally on my PC):

  • portrait of a cartoon man squinting his eyes skeptically at the camera. Text at the top that says, "NOT SURE IF AI MEMES ARE A THING"

  • I mean, I get it: To people like Benedict, DEI is a huge problem! With DEI programs in place a company can reject someone like him because they already have enough assholes 🤷

  • To add to this: It's much more likely that AI will be used to improve compilers—not replace them.

    Aside: AI is so damned slow already. Imagine AI compiler times... Yeesh!

  • Make no mistake: This is the world conservatives want! This is the desired outcome. This is their law. They were warned things like this would happen. They didn't care.

  • No. It ran... Linux.

  • I just wrote a novel (finished first draft yesterday). There's no way I can afford professional audiobook voice actors—especially for a hobby project.

    What I was planning on doing was handling the audiobook on my own—using an AI voice changer for all the different characters.

    That's where I think AI voices can shine: If someone can act they can use a voice changer to handle more characters and introduce a great variety of different styles of speech while retaining the careful pauses and dramatic elements (e.g. a voice cracking during an emotional scene) that you'd get from regular voice acting.

    I'm not saying I will be able to pull that off but surely it will be better than just telling Amazon's AI, "Hey, go read my book."

  • Republicans: We want Americans to have more babies!

    The science on how to make more babies: Stop electing Republicans!

  • Ok so what do we want? Toxic plastics that last forever or toxic plastics that break down in the environment after 3-5 years?

    Because that is the gambit here. We're not going to just get rid of plastics altogether.

    Also, this article is setting off my BS meter: it claims plastics contain 16,000 toxic substances without showing how much of that could realistically get into your body. The dose makes the poison!

    "This spider contains 1300 toxic substances—one of which will kill you if even a tiny droplet gets in your blood! And these spiders are out in the environment!"

  • ...and not stable enough! Not only that but if they're paying $40/hour you can be sure as shit they're never going to raise that rate as long as you work there.

    Also, a huge chunk of your paycheck is going to go to useless health insurance and it's not like these companies are banding together to lobby Congress to make it so they don't have to pay for that anymore (by demanding a single payer system).