A bit late
ClamDrinker (@ClamDrinker@lemmy.world) · Posts: 0 · Comments: 258 · Joined 2 yr. ago
The thing is, games have a minimum difficulty to be somewhat generally enjoyable, and the game designers have often built their game around this. The fun is generally in the obstacles providing real resistance that can be overcome by optimizing your strategy. It means that these obstacles need to be mentally picked apart by the player to proceed. They are built like puzzles.
This design philosophy - anyone who plays these games can tell you - is deeply rewarding if you go through with it, because it requires genuine improvement that you can notice and be proud of. That's also why there's often a limit to how much easier you can make games like these without losing that reward: if an obstacle offers no real resistance, you breeze past it before you even realize it was ever holding you back.
It's often not as easy as just tweaking numbers. And often these development teams don't have the time to rebalance a game for those lower difficulties, so they just don't.
Honestly, the first wojak could be quite mad too, because making an easy game harder often also misses the point: the game is simply more difficult, but doesn't actually provide that carefully crafted feeling of constant improvement. Instead, some easy games become downright frustrating because their obstacles feel "cheap" or "lacking depth" now that you have to spend a lot more time on them.
That said, making an easy game harder by just tweaking the numbers is definitely easier on the development team, and it gives existing players a chance to re-experience the game, which wouldn't happen the other way around. But it's almost certainly not a better option for new players wanting a harder difficulty.
At the end of the day though, there are often ways to get what you want: cheating, modding, or otherwise using 'OP' usables in the game. Do whatever you want to make the game more enjoyable for yourself. But if you make it too easy on yourself, you might come out the other end wondering why other people enjoyed the game so much more than you did.
They said struggle - not that they couldn't. Don't just attribute such a horrible thing based on your own reading. You can have all the empathy for the Russian people and still have none for the Russian state. After all, the Russian state is also directly responsible for the continuous cold-blooded murder of Ukrainian civilians. Not like they gave much warning on February 24th, 2022, or in 2014.
Depends on where you live I suppose. Irrational AI hate is something I only really encounter online. Then again my country has pretty good worker protections, so there's less reason to be afraid of AI.
It's funny how something like this gets posted every few days and people keep falling for it like it's somehow going to end AI. The people that make these models are acutely aware of how to avoid model collapse.
It's totally fine for AI models to train on AI generated content that is of high enough quality. Part of the research to train models is building data sets with a text description matching the content, and filtering out content that is not organic enough (or even specifically including it as a 'bad' example for the AI to avoid). AI can produce material indistinguishable from human work, and it produces material that wasn't originally in the training data. There's no reason that can't be good training data itself.
You can train AI models on AI generated content though. Model collapse only occurs if you train on bad AI generated content. Bots and people talking gibberish are just as bad for training an AI model, but there are ways to filter that out of the training data, such as language analysis. They will also most likely filter out any lowly upvoted comments, or those edited long after their original post date.
And if you start posting now, any sufficiently good AI generated material that other humans like and upvote will not be bad for the model.
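As a rough sketch of what that kind of pre-filtering could look like (the field names and thresholds below are made up purely for illustration, not anything a real training pipeline has published):

```python
from datetime import datetime, timedelta

# Hypothetical thresholds; a real pipeline would tune these empirically.
MIN_SCORE = 5                       # skip lowly upvoted comments
MAX_EDIT_DELAY = timedelta(days=7)  # skip comments edited long after posting

def keep_for_training(comment: dict) -> bool:
    """Return True if a scraped comment looks worth keeping as training data.

    `comment` is an illustrative dict with 'score', 'created_at',
    'edited_at' (None if never edited) and 'text' keys.
    """
    if comment["score"] < MIN_SCORE:
        return False
    if comment["edited_at"] is not None and \
            comment["edited_at"] - comment["created_at"] > MAX_EDIT_DELAY:
        return False
    # A real pipeline would also run language/quality analysis here
    # (e.g. a gibberish or spam classifier) before accepting the text.
    return True

# Tiny made-up example of scraped comments:
scraped = [
    {"score": 42, "created_at": datetime(2024, 1, 1), "edited_at": None,
     "text": "A genuinely helpful answer."},
    {"score": 1, "created_at": datetime(2024, 1, 1), "edited_at": None,
     "text": "first lol"},
    {"score": 30, "created_at": datetime(2024, 1, 1),
     "edited_at": datetime(2024, 3, 1), "text": "Edited months later."},
]

training_set = [c for c in scraped if keep_for_training(c)]
print(len(training_set))  # -> 1 (only the first comment survives both filters)
```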
I like to think its ability to do this is a feature. Imagination beyond our own.
Perhaps. The world can use more kindness when despite everything, loneliness is at an all time high. It's not a fix but maybe it can be a brake on someone's downwards spiral.
Though I'd prefer, and would love, to see someone new match George Carlin's level, much more than someone trying to become him. I don't think we've quite had a chance to savor the good side of AI yet, but hey, you're entitled to your opinion.
That's a pretty sloppy reason. A nuanced topic is not well suited to be explained in anything but descriptive language. Especially if you care about people's livelihoods and passion. I care about my artist friends, colleagues, and acquaintances. Hence I will support them in securing their endeavors in this changing landscape.
Artists are largely not computer experts and artists using AI are buying Microsoft or Adobe or using freebies and pondering paid upgrades. They are also renting rather than buying because everything’s a subscription service now.
I really don't like this characterization of artists. They are not dumb, nor incapable of learning. Technical artists exist too. Installing open source AI is relatively easy, pretty much down to pressing a button. And because it's open source, it's free. Using these tools to their fullest effect is where the skill comes in, and the artists I know are more than happy to develop their skills.
A far bigger market for AI is for non-artists and scammers to fill up Amazon’s bookstore and the broader Internet full of more trash than it already was.
The existence of bad usage of AI does not invalidate good usage of AI. The internet was already full of bad content before AI. The good stuff is what floats to the top. No sane person is going to pay to read some no name AI generated trash. But people will read a highly regarded book that just happened to be AI assisted.
But the whole premise is silly. Did we demonize cars because bank robbers started using them to escape the police? Did we demonize cameras because people could take exact photographic copies of someone else's work? No. We demonized those who misused the tool. AI is no different.
A scammer can generate thousands of worthless garbage images and texts in the time it takes an artist assisted by AI to make a single work. Just like a burglar can make more money by breaking into someone's house and stealing everything than by working a day job for a month. There's a reason these things are illegal and/or unethical. But those are reflections of the people doing them, not of the tools they use.
I mean, you ignored the entire rest of my comment to respond only to a hyperbole I used to illustrate why that's a bad argument. I'm sure they are making money off it, but small creators and artists can make relatively more money off it. You claim that is not 'actually happening', but that is your opinion, how you view things. I talk with artists daily, and they use AI when it's convenient to them, when it saves them work or lets them focus on the work they actually like. Just like they use any other tool at their disposal.
I know there are some very big name artists on social media who are making a fuss about this stuff, but I highly question their motives with my point of view in mind. Of course it makes sense for someone with a big social media following to rally up their supporters so they can get a payday. I regularly see them speak complete lies to their followers, and of course it works. When you actually talk to artists in real life, you'll get a far more nuanced response.
Well then we agree. Let's leave ridiculous arguments out of it. There are far better arguments to make.
I don't disagree with that, but such differences can matter when it comes to ruling if imitation and parody are allowed, and to what extent.
The court might rule in favor of his estate for this reason. But honestly, I do think there are differences between a singer (whose voice becomes an instrument in their song) and a comedian (whose voice is used to communicate the ideas and jokes they want to tell). A different voice could tell the same jokes as Carlin, and if delivered with the same care for his emotions and cadence, could effectively create the same feeling we know. A song, on the other hand, could literally become a different song if you swap an instrument. But the courts will have to rule.
There’s another thing here which is that you seem to believe this was actually made in large part by an AI while simultaneously stating the motivations of humans. So which is it?
AI assisted works are, funnily enough, mostly a human production at this point. If you asked AI to make another George Carlin special for you, it would suck extremely hard. AI requires humans to succeed, it does not succeed at being human. And as such, it's a human work at the end of the day. My opinion is that if we were being truthful, this comedy special would likely be considered AI assisted rather than fully AI generated.
You seem really sure that I think this is fully (or largely) AI generated, but that's not something I've ever said or alluded to believing. I don't believe that. I don't even consider fully AI generated works worthy of being called true art. AI assisted works, on the other hand, I do believe to be art. AI is a tool, and for it to be used for art, it requires humans to provide input and make decisions so that it becomes something people will actually enjoy. And that is clearly what was done here.
The primary beneficiary of all of the AI hype is Microsoft. Secondary beneficiary is Nvidia. These aren’t tiny companies.
"The primary beneficiaries of art hype are pencil makers, brush makers, canvas makers, and of course, Adobe for making photoshop, Samsung and Wacom for making drawing tablets. Not to mention the art investors selling art from museums and art galleries all over the world for millions. These aren't tiny entities."
See how ridiculous it is to make that argument? If something is popular, people and companies who are in a prime position to make money off it will try to do so, that is to be expected under our capitalist society. But small artists and small creators get the most elevation by the advance of open source AI. Big companies can already push out enough money to bring any work they create to the highest standards. A small creator cannot, but they can get far more, and far better results by using AI in their workflow. And because small creators often put far more heart and soul into their works, it allows them to compete with giants more easily. A clear win for small creators and artists.
Just to be extra clear: I don't like OpenAI. I don't like Microsoft. I don't like Nvidia, to a certain degree. Open source AI is not their cup of tea. They like proprietary, closed source AI, the kind where only they and the people who pay them get to use the advancements AI has made. That disgusts me. Open source AI is the tool of choice for ethical AI.
You're right, South Park doesn't need it either. But a disclaimer removes all doubt. The video doesn't need a disclaimer either, but they made one anyway to remove all doubt. And no, they disclaimed any notion that they are George Carlin. Admitting to a crime is not what the disclaimer said, that much should be obvious.
A complete false equivalence. Just because improper disclaimers exist doesn't mean there aren't legitimate reasons to use them. Impersonation requires intent, and a disclaimer is an explicit way to make clear that they are not attempting it, and to spell that out for viewers who might have misunderstood. It's why South Park has such a text at the start of every episode too. It's a rather foolproof way to delegitimize any accusation of impersonation.
Healthy or not, my lived experience is that assuming people are motivated by the things people are typically motivated by (e.g. greed, the desire for fame) is more often correct than assuming people have pure motives.
Everyone likes praise to a certain extent, and desiring recognition for what you've made is independent from your other intentions. My personal experience working with talented creative people is that the two are often intertwined. If you can make something that's both fulfilling and economically sustainable, that's what you'll do. You can make something that's extremely fulfilling, but if it doesn't appeal to anyone but yourself, it doesn't pay the bills. I'm not saying it's impossible for them to have that motivation, but in my opinion, before ascribing malice to someone, there should be at least some proof of it. I have seen no such proof.
I really understand your second point but... as with many things, some things require consent and some things don't. Making a parody or an homage doesn't (typically) require that consent. It would be nice to get it, but the man is dead and even his children cannot speak for him other than as legal owners of his estate. I personally would like to believe he wouldn't care one bit, and I would have the same basis as anyone else to defend that, because nobody can ask a dead man for his opinions. It's clear his children do not like it, but unless they have a legal basis for that it can be freely dismissed as not being something George would stand behind.
I've watched pretty much every one of his shows, but I haven't seen that documentary. I'll see if I can watch it. But knowing George, he would have many words to exchange on both sides of the debate. The man was very much an advocate for freedom of creativity, but also very much in favor of artist protection. Open source AI has leveled the playing field for people that aren't mega corporations to compete, but has also brought along insecurity and anxiety to creative fields. It's not black and white.
In fact, there is a quote attributed to him which speaks somewhat to this topic. (Although I must admit, the original source is a now-defunct newspaper and the Wayback Machine didn't crawl the article.)
[On his work appearing on the Internet] It's a conflicted feeling. I'm really a populist, down in the very center of me. I like the power people can accrue for themselves, and I like the idea of user-generated content and taking power from the corporations. The other half of the conflict, though, is that, traditionally speaking, artists are protected from copyright infringement. Fortunately, I don't have to worry about solving this issue. It's someone else's job.
August 9, 2007 in Las Vegas CityLife. So just a little less than a year before his death too.
EDIT: Minor clarification
Completely true. But we cannot reasonably push the responsibility of the entire internet onto someone when they did their due diligence.
Like, some people post CoD footage to YouTube because it looks cool, and someone else either mistakenly or maliciously takes that and recontextualizes it as combat footage from an active warzone to shock people. Then people start reposting that footage with a fake explanation text on top of it, furthering the misinformation cycle. Do we now blame the people sharing their CoD footage for what other people did with it? Misinformation and propaganda are something society must work together to combat.
If it really matters, people will be out there warning others that the pictures being posted are fake. In fact, that's what happened after tragedies even before AI: people would post images claiming to show what happened, only for them to later be confirmed as being from some other tragedy. Or how some video games get fake leaks because someone rebranded fanmade content as a leak.
Eventually it becomes common knowledge or easy to prove as being fake. Take this picture for instance:
It's been well documented that the bottom image is fake, and as such anyone can now find out what was covered up. It's up to society to speak up when the damage is too great.
We can argue their motives all we want (I’m pretty uninterested in it personally), but we aren’t them and we don’t even know what the process was to make it
Yes, that is sort of my point. I'm not sure either, but neither did the person I responded to (in my first comment before yours). And to make assumptions with such negative implications is very unhealthy in my opinion.
and I think that is because the whole thing sure would seem less impressive if they just admitted that they wrote it.
It's the first time I've heard someone suggest they passed off their own work as AI, but it could also be true. Although some consider AI assisted material to be the same as fully AI generated. But again, we don't know.
I laughed maybe once, because the whole thing was not very funny in addition to being a (reverse?) hack attempt by them to deliver bits of their own material as something Carlin would say.
I definitely don't think it meets George's level. But it was amusing to me. Which is about what I'd expect of an homage.
For sure! Deceit should be punished. Ethical AI usage should not go without disclosure, so I think we must be understanding toward people who choose to be open about it, rather than having to hide it to dodge hate.
I like Vernor Vinge’s take on it in one of his short stories where copyrights are lessened to 6 months and companies must quickly develop their new Worlds/Characters before they become public domain.
That's an interesting idea. Although 6 months does sound like an awfully short time to actually develop something more grand. But I do think with fairer copyright limits we could also afford to provide more protections in the early days after a work's release. It's definitely worth discussing such ideas to make copyright better for everyone.
The thing is, I've seen statements like this before. Except when I heard it, it was being used to justify ignoring women's experiences and feelings in regard to things like sexual harassment and feeling unsafe, since that's "just a feeling" as well. It wasn't okay then, and it's not okay the other way around. The truth is that feelings do matter, on both sides. Everyone should feel safe and welcome in their surroundings. And how much so that is, is reflected in how those people feel.
The outcomes of men feeling respected and women feeling safe are not mutually exclusive. The sad part is that anyone reading this here is far more likely to be an ally than a foe, yet the people who most need to hear the intended message will most likely never hear it, nor be bothered by it. There's a stick being wedged here that is only meant to divide, and oh my god is it working.
The original post about bears has completely lost all meaning, and any semblance of discussion is lost because the metaphor is inflammatory by design - sometimes that's a good thing, to highlight a point through absurdity. But metaphors are fragile: if one is very likely to be misunderstood or to offend, the message is lost in emotion. Personally I think this metaphor is just highly ineffective at getting the message across, as it has driven people who would stand by the original message to the other side through the many uncharitable interpretations it presents. And among the crowd of reasonable people are those who confirm those interpretations and muddy the water, making women seem like misandrists and men like sexual assault deniers. This meme is simply terrible, and perhaps we can move on to a better version that actually gets the message across, instead of getting people at each other's throats.