Posts: 17 · Comments: 1,137 · Joined: 2 yr. ago

  • They're not illegally harvesting anything. Copyright law is all about distribution. As much as everyone loves to think that when you copy something without permission you're breaking the law, the truth is that you're not. It's only when you distribute said copy that you're breaking the law (aka violating copyright).

    All those old school notices (e.g. "FBI Warning") are 100% bullshit. Same for the warning the NFL spits out before games. You absolutely can record it! You just can't share it (or show it to more than a handful of people but that's a different set of laws regarding broadcasting).

    I download AI (image generation) models all the time. They range in size from 2GB to 12GB. You cannot fit the petabytes of data they used to train the model into that space. No compression algorithm is that good.

    The same is true for LLMs, RVC (audio) models, and similar models/checkpoints. I mean, think about it: if AI were illegally distributing millions of copyrighted works to end users, they'd all have to be packed into those files somehow.

    Instead of thinking of an AI model like a collection of copyrighted works think of it more like a rough sketch of a mashup of copyrighted works. Like if you asked a person to make a Godzilla-themed My Little Pony and what you got was that person's interpretation of what Godzilla combined with MLP would look like. Every artist would draw it differently. Every author would describe it differently. Every voice actor would voice it differently.

    Those differences are the equivalent of the random seed provided to AI models. If you throw something at a random number generator enough times you could--in theory--get the works of Shakespeare. Especially if you ask it to write something just like Shakespeare. However, that doesn't mean the AI model literally copied his works. It's just making its best guess (it's literally guessing! That's how it works!).
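
    To put rough numbers on the size point above, here's a quick back-of-the-envelope sketch (taking the petabyte figure from this comment at face value and assuming a mid-range ~4GB checkpoint):

    ```python
    # Back-of-the-envelope: the "compression ratio" a checkpoint would need
    # in order to literally contain its training data. Sizes are illustrative.
    training_data_bytes = 1 * 1000**5   # assume ~1 petabyte of training images
    checkpoint_bytes = 4 * 1000**3      # assume a ~4 GB model file

    ratio = training_data_bytes / checkpoint_bytes
    print(f"required compression ratio: {ratio:,.0f} : 1")  # ~250,000 : 1

    # For comparison, general-purpose lossless compressors manage roughly
    # 2-4 : 1 on typical data, and even aggressive lossy image codecs are on
    # the order of 10-100 : 1 -- nowhere near enough to hide the originals.
    ```

    The weights end up storing statistical patterns learned from the data, not the data itself.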

  • Element needs to be better. Discord is awesome with the way it auto-plays looping videos/gifs and has animated emojis.

    Seriously: That's all they'd need to do. The Element devs need to focus on fun.

  • Permanently Deleted

  • Yeah! They are now stuck reading comments like ours! 😤

  • Permanently Deleted

  • Hopefully AI will change this. In the future, when anyone claims to have such pictures you can just say it's AI-generated nonsense 🤷

    The world will have to go back to actually verifying claims.

    We also need to teach our children about porn, dammit! Even if someone posts a thousand naked pictures of you to the Internet, they will be lost in the sea of naked pictures. Very few will care or notice, and no, future employers won't go looking; even if they hear about it, why TF would they even care? You can still do the job.

    Anyone who does care that naked pictures exist is obviously lacking moral character and probably needs to re-think what's important in life.

  • Permanently Deleted

  • "put a hole in their budget that has to be filled."

    Who says it has to be filled? Republicans certainly don't.

  • Also, the most deaths will be in red states.

    Make of that knowledge what you will.

  • If you don't know what the OP here is talking about: both Tesla and Musk have loans that require collateral. If Tesla's stock drops enough, it will go below the collateral threshold and the banks to whom the money is owed will come calling.

    Since Musk owns many billions in other things he can easily pay up when that happens but wow will it be expensive and embarrassing. He might even fall out of the top 100 richest!
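
    For anyone curious how such a collateral threshold works mechanically, here's a toy sketch. Every number in it is made up for illustration and has nothing to do with the actual loan terms:

    ```python
    # Toy margin-loan math: the lender requires the pledged shares to stay
    # worth at least `maintenance_ratio` times the outstanding loan balance.
    # All values below are hypothetical, not Tesla's or Musk's real terms.
    loan_balance = 3.5e9        # hypothetical loan, in dollars
    shares_pledged = 90e6       # hypothetical number of pledged shares
    maintenance_ratio = 2.0     # hypothetical: collateral must stay >= 2x the loan

    # Share price below which the collateral no longer covers the requirement.
    call_price = maintenance_ratio * loan_balance / shares_pledged
    print(f"margin call triggered below ~${call_price:,.2f} per share")  # ~$77.78

    def is_margin_called(current_price: float) -> bool:
        """True once the pledged shares are worth less than the required cushion."""
        return current_price * shares_pledged < maintenance_ratio * loan_balance

    print(is_margin_called(70.0))   # True: $70 * 90M shares < 2 * $3.5B loan
    ```

    Once that check flips to True, the borrower has to post more collateral or pay the loan down, which is the "come calling" part.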

  • What are you talking about? People emigrating (after an enormous genocide) is how this country was made!

  • Conservatives love the concept that private charities should be the ones who provide "handouts" (e.g. free food) and things like health care to those who can't afford it. Except they always conveniently neglect the fact that all the charities in the world combined are but a fraction of a percent of government spending on similar things.

    Furthermore, charities cannot hope to match the efficiency of governments providing the same services. Even the most inefficient government agencies have far lower overhead than the best-run charities do.

    Governments don't have to solicit for donations. They aren't beholden to the market or bad economic times. They can pool bureaucratic resources across multiple programs and take advantage of economies of scale that the biggest charities could only dream of.

    The truth is that if you actually want to help the most desperate people of the world, government programs are the most efficient and effective way to do so.

  • "Thousands of tons of plastic pollution"

    OK so a trivial and insignificant amount.

  • Nope. In fact, if you generate a lot of images with AI you'll sometimes notice something resembling a watermark in the output, demonstrating that the images used to train the model did indeed have watermarks.

    Removing such imaginary watermarks is trivial in image2image tools though (it's just a quick extra step after generation).
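
    If you're wondering what that extra step looks like in practice, it's typically just a low-strength image-to-image pass over the finished picture. A rough sketch using Hugging Face diffusers (the checkpoint name, strength value, and file names are placeholders; any img2img-capable tool does the same job):

    ```python
    # Low-strength img2img pass: re-denoise the generated image just enough to
    # paint over small artifacts (like a faint watermark ghost) without
    # changing the composition. Model id and parameters are illustrative only.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",   # placeholder: any SD-style checkpoint
        torch_dtype=torch.float16,
    ).to("cuda")

    init_image = Image.open("generated.png").convert("RGB")

    result = pipe(
        prompt="same prompt as the original generation",
        image=init_image,
        strength=0.25,        # low strength = small changes, composition kept
        guidance_scale=7.0,
    ).images[0]

    result.save("generated_cleaned.png")
    ```

    Low strength only re-paints fine detail, which is usually enough to wash out a faint watermark-like artifact.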

  • ...or trying to get away with as much as possible and seeing what sticks.

  • Reminder: The Bill of Rights applies to all persons living or residing in the United States. Whether or not you're a citizen is irrelevant. Green card, visa, no visa, etc. It doesn't matter.

    Everyone gets freedom of speech in the US. Everyone.

  • To be fair, when it comes to stock photos the creatives already got paid. You're just violating the copyright of a big corporation at that point (if you distribute the images... If you never distribute the images then you've committed no crime).

  • Why stop at "AI-generated"? Why not have the individual post their entire workflow, showing which model they used, the prompt, and any follow-up editing or post-processing they did to the image?

    In the 90s we went through this same shit with legislators trying to ban photoshopped images (hah: they still try this from time to time). Then there were attempts at legislating mandatory watermarks and similar schemes. It's all the same pattern: new technology is scary, so regulate and restrict it.

    In a few years AI-generated content will be as common as photoshopped images and no one will bat an eye because it'll "just be normal". A photographer might take a picture of a model (or a number of them) for a cover or something then they'll use AI to change the image after. Or they'll use AI to generate an image from scratch and then have models try to copy it. Or they'll just use AI to change small details in the image such as improving lighting conditions or changing eye color.

    AI is very rapidly becoming just another tool in photo/video editing and soon it will be just another tool in document writing and audio recording/music creation.

  • Not a bad law if applied to companies and public figures. Complete wishful thinking if applied to individuals.

    For companies it's actually enforceable, but for individuals it's basically impossible, and even if you do catch someone uploading AI-generated stuff: who cares? It's the intent that matters when it comes to individuals.

    Were they trying to besmirch someone's reputation by uploading false images of that person in compromising situations? That's clear bad intent.

    Were they trying to incite a riot or intentionally spreading disinformation? Again, clear bad intent.

    Were they showing off something cool they made with AI generation? It is of no consequence and should be treated as such.

  • If they really want to get Trump and Musk to care the picture should have white children.