I see a lot of Dunning-Kruger here as well. The fact is that you can generate novel images/texts/whatever with these tools. They may mostly suck, but they're still novel, so they can be copyrighted by whoever used the tools to create them.
If it were a compression algorithm, it would be insanely efficient, and that would be the big story about it. The simple fact is that they can't reproduce their exact training data, so no, they aren't storing it in a highly compressed form.
The LLMs don't deserve or have any rights. They're a tool that people can use, just like reference material, spellcheckers, asset libraries, or whatever else creatives use. As long as they don't actually violate copyright in the classical sense of just copy-pasting stuff, what people generate with them is probably about as (un)original as a lot of art out there. And collages can be transformative enough to qualify for copyright.
In reality, people learn how to write lyrics by listening to songs. Nobody writes a song without having heard thousands of them, and many human-written songs are really similar to each other; otherwise the music industry wouldn't be littered with lawsuits. I don't really see the difference.
I'd say you're dead once you take a breath of that. It's probably going to feel like breathing 100% nitrogen, because you're still exhaling CO2 normally, so your body never notices anything is wrong. You'll get weak really fast as you're deprived of oxygen, and then you just die because your red blood cells are ruined.
I don't think a potentially deadly concentration of carbon monoxide would displace enough oxygen for a flame to go out. The deadly part is that it binds to the hemoglobin in your red blood cells and makes them useless for carrying oxygen.
Of course they calculate quite carefully whether it's worth the effort to get rid of fake reviews. I definitely think Amazon would eradicate them right now if it could do so easily. And since the quantity and quality of fake reviews is bound to rise with recent technological developments, the scales might tip toward them having to do more about it, because offering good deals as an apology isn't something they'd be happy to do for more and more people.
Where?