Why are modern games obsessed with parrying? | Semi-Ramblomatic
PlzGivHugs @sh.itjust.works · Posts 38 · Comments 510 · Joined 2 yr. ago
I believe Jack has done a bit for his anti-freebooting campaign, but to my knowledge, it hasn't been much and this is by far the most formal.
I'm assuming you have a store key.
You could post to one of the relevant communities for game giveaways, such as !freegames@feddit.uk or !Randomactsofgaming@lemm.ee
Standard practice is to pretty much just say what game you're offering, and then DM the key to a random respondent after a day or two.
The only one that comes to mind is Club Penguin's coffee shop. That one is burned into my memory.
My point of contention is that the arguments you're using are flawed, not your intentions. OpenAI, Meta, Disney, etc. are in the wrong because they pirate/freeboot and infringe on independent artists' licenses. It's not their use of technology or the derivative nature of the works it produces that are the problem: making AI the face of the issue moves the blame away from the companies, and allows them to continue to pirate/freeboot/plagiarize (or steal, as you define it) from artists.
Yes, part of my point is that capitalism is bad, but that's further up the chain than what I was arguing. My point is that copyright law, and more importantly its implementation and enforcement, is broken. Basically all your issues originate not with AI but with the fact that independent artists have no recourse when their copyrights are violated. AI wouldn't be an issue if AI companies actually paid artists for their work, and artists could sue companies who infringe on their rights. The problem is that artists are being exploited and have no recourse.
Using an allegory to hopefully make my point a bit more clear: imagine you have a shop of weavers (artists). The company running the shop brings in a loom (AI), and starts chaining their workers to it and claiming it's an Automatic Weaver™ (pirating and violating artists' rights). The problem isn't the loom, and blaming it shifts blame away from whoever it was that decided to enslave their workers. Trying to ban the loom doesn't prevent the shop from just chaining the workers to their desks, as was often done in the past, nor does it prevent them from bringing in Automatic Potters™. If you want to stop this, even ignoring the larger spectre of capitalism, it should be slavery that is outlawed (already done) and punished (not done), not the use of looms.
If you are trying to fix/stop the current state of AI and prevent artists from being exploited by massive companies in this way, banning AI will only slow it and will limit potentially useful technology (that artists should be paid for). Rather than tackle one of the end results of the problem, you need to target it closer to its root: the fact that large companies can freely pirate, freeboot, and plagiarize smaller artists.
It isn't current AI voice tech that was the issue; it was the potential for future AI they were worried about. AI voices, as they are now, are of similar quality to pulling someone off the street and putting them in front of a mid-range mic. If you care about quality at all (without massive changes to how AI tech functions), you'll always need a human.
And to be clear, what about AI makes it the problem, rather than copyright? If I can use a voice synthesizer to replicate an actor's voice, why is that fine and AI not? Should it not be that reproduction of an actor's voice is right or wrong based on why it's done and its implications, rather than because of the technology used to replicate it?
Edit: And to be clear, just because a company can use it as an excuse to lower wages doesn't mean it's a viable alternative to hiring workers. Claims that they could replace their workers with AI are just the usual capitalist bullshit excuses to exploit their workers.
Big movie studios will use it to generate parts (and eventually all) of a movie. They can use this as leverage to pay the artists less and hire fewer of them. Animators, actors, voice actors.
Only if it's profitable, and given that AI output is inherently very limited, it won't be. AI can only produce lower-quality, derivative works. In isolation, some works might not be easy to distinguish, but that's only on a small scale and in isolation.
If a movie studio pirated work and used it in a film, that's against copyright and we could sue them under current law.
But if they are paying OpenAI for a service, and it uses copyrighted material, since OpenAI did the stealing and not the studio, it's not clear if we can sue the studio.
You can sue the studio, in the same way you would sue the studio if an artist working there (or even someone directing artists) creates something that violates copyright, even by accident. If they publish a work that infringes on copyright, you can sue them.
Seems like it's being argued that because of the layer of abstraction that is created when large quantities of media are used, rather than an individual's work, it's suddenly a victimless crime.
By that logic, anything that takes inspiration, no matter how broad, or uses another's work in any way, no matter how transformative, should be prevented from making their own work. That is my point. AI is just an algorithm that takes thousands of images and blends them together. It isn't evil, any more than a paintbrush is. What is, is piracy for commercial use, and non-transformative copyright infringement. Both of these are already illegal, but artists can't do anything about it, not because companies haven't broken the law, but because an independent author trying to take, for example, Meta to court is going to bankrupt themselves.
Edit: Also notable in companies using/not using AI is the fact that even transformative and """original""" AI work cannot be copyrighted. If Disney makes a movie that's largely AI, we can just share it freely without paying them.
No, it is theft. They use an artist's work to make an image they would otherwise pay the artist to make (a worse version, but still). And given how I've seen an image with a deformed Patreon logo in the corner, they didn't pay what they should have for the images. They stole a commission.
But were they (the AI users) going to pay for the content? I have never paid for a Patreon, given that I don't really have any disposable income. Why would I start, just because AI exists? Just because a sale may be made in some contexts doesn't mean it has been made.
And it is copyright violation. There have been successful lawsuits over much less than a direct image of RDJ in the Iron Man suit with the Infinity Stones on his hand.
It's a copyright violation when material is made that violates existing copyright. It isn't copyright infringement to take data from media, or to create derivative works.
And if they won't pay an artist's rates, there's no way they'd pay whatever Disney would charge them
Disney has lawyers. Small artists don't.
AI is a nazi-built, kitten blood-powered puppy kicking machine built from stolen ambulance parts. Even if stealing those ambulance parts is a lesser sin than killing those kittens, it's still a problem that needs to be fixed. Of course, AI will never be good, so we need to get rid of the whole damn thing.
Banning AI doesn't stop the Nazis from running the government or influencing the populace, it doesn't stop them burning the planet, and it doesn't stop them pirating work and otherwise exploiting artists. Hell, politicians have been doing all of these things without repercussions for a century. If you want the rich and powerful to stop pirating and freebooting artists' work, maybe the first step is to ban that (or rather, enforce it) rather than a technology two steps removed?
AI images try to replicate the style of popular artists by using their work, often including work that was behind a paywall and taken without payment, thus denying the artists revenue. AI has taken something from the artist, and cost the artist money. Until such a time as we come up with a new word for this new crime, we'll call it by the closest equivalent: theft.
I'd argue it's much closer to piracy or freebooting. Generally, its use doesn't hurt artists, seeing as a random user isn't going to spend hundreds or thousands to hire a talented artist to create shitposts for them. That doesn't necessarily make it okay, but it also doesn't directly hurt anyone. In cases of significant commercial use, or copyright infringement, I'd argue it's closer to freebooting: copying another's work and using it for revenue without technically directly damaging the original. Both of these are crimes, but both are more directly comparable and less severe than actual theft, seeing as the artist loses nothing.
Also, someone did an experiment and typed "movie screenshot" into an AI and it came back with a nearly identical image from Endgame. Not transformative enough to be anything but copyright infringement.
Copyrighted material is fed into an AI as part of how it works. This doesn't mean that anything that comes out of it is or is not copyrighted. Copyrighted material is also used in Photoshop, for example, but as long as you don't use Photoshop to infringe on someone else's copyright, there isn't anything intrinsically wrong with Photoshop's output.
Now, if your complaint is that much of the training data is pirated or infringes on the licensing it's released under, that's another matter. Endgame isn't a great example, given that it can likely be bought with standard copyright limitations, and ignoring that, it's entirely possible Disney has been paid for their data. We do know huge amounts of smaller artists have had their work pirated to train AI, though, and because of the broken nature of our copyright system, they have no recourse - not through the fault of AI, but of corrupt, protectionist governments.
All that said, there are still plenty of reasons to hate AI (and especially AI companies), but I don't think the derivative nature of the work is the primary issue. Not when they're burning down the planet, flooding our media with propaganda, and bribing governments, just to create derivative, acceptable-at-best """art""". Saying AI is the problem is an oversimplification - we can't just ban AI to solve this. Instead, we need to address the problematic nature of our copyright laws, legal system, and governments.
We have first entrance, yes, but what about second entrance?
From my understanding, it's an Atlantic accent, although one that has mostly disappeared over the last few decades.
It wasn't a shadow drop, but it was relatively light on PR. They had one big trailer six months before launch, some gameplay videos, and of course the store page available.
Here is the article this article is using for its source: https://www.nytimes.com/2025/05/19/world/europe/russia-finland-border.html
Edit: Yes, the article isn't great, and clearly has a Russian bias. I was just linking it because it's the original, more complete source.
I think two main things need to happen: increased transparency from AI companies, and limits on use of training data.
In regards to transparency, a lot of current AI companies hide information about how their models are designed, produced, weighted, and used. This causes, in my opinion, many of the worst effects of current AI. Lack of transparency around training methods means we don't know how much power AI training uses. Lack of transparency in training data makes it easier for the companies to hide their piracy. Lack of transparency in weighting and use means that many of the big AI companies can abuse their position to push agendas, such as Elon Musk's manipulation of Grok, and the CCP's use of DeepSeek. Hell, if issues like these were more visible, it's entirely possible AI companies wouldn't have as much investment, and thus power, as they do now.
In terms of limits on training data, I think a lot of the backlash to it is over-exaggerated. AI basically takes sources and averages them; while there is little creativity, the work is derivative and bland, not a direct copy. That said, if the works used for training were pirated, as many were, there obviously needs to be action taken. Similarly, there needs to be some way for artists to protect or sell their work. From my understanding, they technically have the legal means to do so, but as it stands, enforcement is effectively impossible and non-existent.
Do you have a source for that graph? I'm interested in the study, but couldn't find it with a web search.
You underestimate how bad we can be at cooking. It takes me like an hour just to peel and chop up ingredients for even a simple dish like mashed potatoes or stir fry.
It does exist, but it's not common. My guess is that it's multifaceted. First and biggest is obviously sexism: women are seen as weak, and men are expected to take care of them. Anything other than this is taboo, and in more conservative areas, seen as wrong. The only reason it's at all accepted in kink is that it's purely behind closed doors, so people get less chance to attack it. I also think that most women just enjoy the status quo. Whether this is because women like to be submissive, or just that people in general like to be submissive and women don't have to hide it, is impossible to tell. Regardless, a lot of people are perfectly happy with the current expectations. Finally, I think it's just momentum. It takes time for things to change.
If you do want to find more stuff like this, the main terms to look for are "role reversal" for nuclear-family-esque relationships with reversed gender roles, "female-led relationship" for generally more intense women-in-charge dynamics, or "gentle femdom" for more gentle and caring, but more sex-focused, stuff. As much as I hate to recommend it, Reddit is probably the best spot to look, seeing as there aren't really any other aggregators for something that niche and taboo.
It's not inherently a bad idea, but you need to keep in mind that it's a crutch and not a cure. Substances won't solve your problems, and shouldn't be your sole, chosen treatment.
That said, just as crutches can help you survive and are often healthier overall than bedrest, so too can substances help. For example, if you have severe, chronic depression and can't afford medication, while weed won't help (and may make it slightly worse overall), it might help you remain functional during depressive episodes. It's no medication, but if it keeps you alive long enough to keep working towards a long-term solution, it's worth it.
Lemmy was originally founded by political extremists who wanted a space for their politics (tankies). It's since grown past that, but that influence is still present in many ways, most prominently in the influences of .ml. On top of this, politics is something inflammatory (and thus engaging) that affects everyone. Because it's both engaging and broad-appeal, it's going to be something everyone talks about. On the other hand, many niches, aside from being niche, are often less inherently engaging (i.e. talking about a finished TV show). This makes it very hard to get the critical mass needed for a community to snowball into relevance. This means that (effectively) all you're left with is the political communities and a couple of niches that are broad-appeal enough and have active enough users to be stable.
Yes, it's ableist. Excluding, insulting, or discriminating against people for a disability is basically the definition, regardless of what they claim. Saying they're not ableist doesn't change it, any more than saying something like, "I'm not racist, but..."
His point is that it depends on how (and where) it's implemented. It needs to be woven into the combat system as it is in FromSoft games, Batman, Ultrakill, or Cuphead, not tacked on because it's easy or popular. Each of those uses parrying in a different way to enhance its combat. On the other hand, if you take these mechanics without the greater context or understanding of why they work, they tend to stand out as bad, or remain unused. Doom Eternal is an example that immediately comes to mind. The whole game is about fast-paced combat, with a plethora of new mobility mechanics - that is, until you encounter one of the enemies you need to parry. Then the game comes to a grinding halt while you wait for the enemy to take an action you can react to, completely at odds with the rage-fueled persona and the mobility focus of every other mechanic. Compare that to Ultrakill, where parrying isn't just a reactive way to mitigate damage; it's a situational attack that lets you keep moving and keep up your carnage.
Game mechanics work best when they're cohesive. Parrying, due to its simplicity, can be tacked on easily, breaking this cohesion if it's not given the same weight as the rest of the mechanics.