
  • Man, that's a museum piece at this point. It's what, mid-80s? If it looks bright, stable and clean without any maintenance that's... increasingly rare.

  • Wait, is that stretched to 16:9? I may have to rescind my upvote.

    But also... cool.

  • I don't disagree on principle, but I do think it requires some thought.

    Also, that's still a pretty significant backstop. You basically would need models to have a way to check generated content for copyright, in the way Youtube does, for instance. And that is already a big debate, whether enforcing that requirement is affordable to anybody but the big companies.

    But hey, maybe we can solve both issues the same way. We sure as hell need a better way to handle mass human-produced content and its interactions with IP. The current system does not work and it grandfathers in the big players in UGC, so whatever we come up with should work for both human and computer-generated content.

  • That's not "coming", it's an ongoing process that has been going on for a couple hundred years, and it absolutely does not require ChatGPT.

    People genuinely underestimate how many of these things have been an ongoing concern. Much as crypto isn't that different from what you can do with a server, "AI" isn't a magic key that unlocks automation. I don't even know how this mental model works. Is the idea that companies who are currently hiring millions of copywriters will just rely on automated tools? I get that yeah, a bunch of call center people may get removed (again, a process that has been ongoing for decades), but how is compensating Facebook for scraping their social media posts for text data going to make that happen less?

    Again, I think people don't understand the parameters of the problem, which is different from saying that there is no problem here. If anything the conversation is a net positive in that we should have been having it in 2010 when Amazon and Facebook and Google were all-in on this process already through both ML tools and other forms of data analysis.

  • I'm gonna say those circumstances changed when digital copies and the Internet became a thing, but at least we're having the conversation now, I suppose.

    I agree that ML image and text generation can create something that breaks copyright. You can certainly duplicate images or use copyrighted characters. This is also true of Youtube videos and Tiktoks and a lot of human-created art. I think it's a fascinating question to ponder whether the infraction is in what the tool generates (i.e. did it make a picture of Spider-Man, who is under copyright and thus can't be used that way, and sell it to you for money) or in the ingest that enables it to do that (i.e. it learned on pictures of Spider-Man available on the Internet, and thus all output is tainted because the images are copyrighted).

    The first option makes more sense to me than the second, but if I'm being honest I don't know if the entire framework makes sense at this point at all.

  • A lot of this can be traced back to the invention of photography, which is a fun point of reference, if one goes to dig up the debate at the time.

    In any case, the idea that humans can only produce so fast for so long and somehow that keeps the channel clean just doesn't track. We are flooded by low quality content enabled by social media. There are seven billion of us, two or three billion of those are on social platforms, and a whole bunch of the content being shared is made by pointing phones at stuff using corporate tools. I guarantee that people will still go to museums to look at art regardless of how much cookie cutter AI stuff gets shared.

    However, I absolutely wouldn't want a handful of corporations to have the ability to empower their employed artists with tools to run 10x faster than freelance artists. That is a horrifying proposition. Art is art. The difficulty isn't in making the thing technically (say hello, Marcel Duchamp, I bet you thought you had already litigated this). Artists are gonna art, but it's important that nobody has a monopoly on the tools to make art.

  • It's not right to say that ML output isn't good at practical tasks. It is, it's already in use, and it has been for ages. The conversation about these is guided by the relatively anecdotal fact that chatbots and image generation got good, so this stuff went viral, but ML models are being used for a bunch of practical purposes, from speeding up repetitive, time consuming tasks (e.g. cleaning up motion capture, facial modelling or lip animation in games and movies) to specialized ones (so much science research is using ML tools these days).

    Now, a lot of those are done using fully owned datasets, but not all, and the ramifications there are also important. People dramatically overestimate the impact of trash product flooding channels (which is already the case, as you say) and dramatically underestimate the applications of the underlying tech beyond the couple of viral apps they only got access to recently.

  • Yep. The effect of this as currently framed is that you get data ownership clauses in EULAs forever and only major data brokers like Google or Meta can afford to use this tech at all. It's not even a new scenario, it already happened when those exact companies were pushing facial recognition and other big data tools.

    I agree that the basics of modern copyright don't work great with ML in the mix (or with the Internet in the mix, while we're at it), but people are leaning on the viral negativity to slip by very unwanted consequences before anybody can make a case for good use of the tech.

  • I think viral outrage aside, there is a very open question about what constitutes fair use in this application. And I think the viral outrage misunderstands the consequences of enforcing the notion that you can't use openly scrapable online data to build ML models.

    Effectively what the copyright argument does here is make it so that ML models can only legally be made by Meta, Google, Microsoft and maybe a couple of other companies. OpenAI can say whatever, I'm not concerned about them, but I am concerned about open source alternatives getting priced out of that market. I am also concerned about what it does to previously available APIs, as we've seen with Twitter and Reddit.

    I get that it's fashionable to hate on these things, and it's fashionable to repeat the bit of misinformation about models being a copy or a collage of training data, but there are ramifications here people aren't talking about and I fear we're going to the worst possible future on this, where AI models are effectively ubiquitous but legally limited to major data brokers who added clauses to own AI training rights from their billions of users.

  • Maybe? Copyright is completely broken if you ask me. Right now we seem to operate on a "don't enforce unless we feel like it" worldwide framework across the board and everything is weird and bad.

  • I mean... yeah, retailer gut checks were a major driver for the industry for ages. The entire myth of the videogame crash in the early eighties, blown out of proportion as it is, comes down to retailers having a bad feeling about gaming after Atari. I'm big on preservation and physical media, but don't downplay the schadenfreude caused by the absolutely toxic videogame retail industry entirely collapsing after digital distribution became a thing. I'll buy direct to consumer from boutique retailers all day before I go back to buckets of games stolen from little kids and retailers keeping shelf space hostage based on how some rep's E3 afterparties went.

    That said, those guys really did flood the market with cookie cutter games in a very short time there for a while. There were a LOT of these.

    Weirdly, Neverwinter Nights must have done extremely well, given how much credit Bioware gives it for redefining the genre, but at the time I remember being frustrated by it. It looked worse than the 2D stuff, and while the user generated content tools were fun to mess with, they didn't create the huge endless content mill you'd expect from something like that today.

    I should go look up if there's any data about how commercially successful it really was somewhere. Any pointers?

  • It means people wanted a way to separate JRPGs from western fantasy RPGs and tabletop or pen-and-paper RPGs.

    Off the top of my head I'm struggling to remember if the term caught on in opposition to pen and paper being the default RPG or to JRPGs first, because JRPGs didn't get popular everywhere at once, but CRPGs were big in all Western territories pretty much right away.

  • It seems like it would have been hard to avoid acknowledging the mistake, given that the mistake was very clearly lodged into somebody's backyard, as opposed to still being attached to the rest of the plane, but alright.

    Hey, some people can have a human interaction when doing damage control during a crisis, and apparently this CEO I didn't know about until just now isn't one of those. There are now two different lessons to take away from this.

    For the record, flashy as this thing was it's not that big of a deal, but it sure is funny and spectacular.

  • I don't hate. I like a good keyboard.

    Now, do I think obsessing about the extremely specific properties of switches and keycaps and spending hours manually embedding each individual key component just to get a specific color combination makes sense as a hobby? Hell no. But then neither does collecting stamps or watching people's grocery runs on Youtube. You do what you want, and this hobby at least lets you put whatever icon you please on the Bixby button.

    I'll say this, though, that justification, which I have used often to myself and others, is a terrible rabbit hole of mismanaged finances. That is true of your monitor, your PC, your laptop, your phone, your keyboard, your chair, your desk... by the time you're done you've spent a year's salary setting up your workstation with absurdly luxurious, custom gear that sometimes makes no discernible difference. By all means get whatever stuff saves you from injury and provides comfort and satisfaction, but we all know in many of those categories the quality curve flattens out way before the price curve does.

    Also, I guarantee most people with a custom keyboard swap it out more often than people who are still using the crappy board that came for free with their prebuilt or was given to them at work. I have dirt cheap Dell keyboards that still work fine. I may not love how they feel or sound, but it turns out we mastered the art of making buttons a while ago, and closing a circuit with a conductive pad reliably is not a particularly costly proposition. Hey, buy good keyboards for the feel or because you have a glitzy hobby, but don't lie to yourself or me about it. You're a grown person, own that superfluously expensive nerdy taste. If boomers could brag about their fountain pens, you can smugly bore your friends talking about the injection molding in the keycaps you bought matching a specific Pantone.

  • Once the superheroes start to go it gets weird, because at some point the likeness is the least of the issues there. You'd probably want to redesign the costume anyway. Once you can publish stories with Superman or Batman and use the names and at least some of the core cast, why stick to the rest of the package, given how constantly it cycles?

    Only it's all still going to be a minefield. Famously the Sherlock Holmes guys were out there trying to sue Netflix for having their Holmes be too emotional, which they argued was still protected. I mean, they lost, but outside of the fan productions that already exist are you gonna bet your business on that?

  • The Shadow is coming up in a couple of years. Conan is up in a few. I think Peter Pan and Poirot were in this batch already, too. Tintin would be due next year, but for all the crap Disney gets, apparently its term is longer because it only starts counting after the author dies.

  • OK, what I'm increasingly getting from this thread is that one-off, kinda scammy touristy places get over-reported and maybe mixed up with outdoor stand-alone toilets? Stuff gets presented like "in the EU you have to pay for public toilets" in clickbaity travel articles, but it seems more like people visited one scammy place that was charging and that's what gets talked about. Maybe I just don't go to enough tourist traps.

  • Where was this? The times I've been in France I was there with friends and I've been in Paris for maybe four hours in my entire life, but that sounds like it was either in the 90s or you were being scammed in more ways than the toilet.

    I mean, what I can tell you is I'd definitely have found a different toilet unless this was a free-standing outdoor latrine and I was in a hell of a hurry, just based on the fee, let alone the squatting toilet thing.

  • As far as I know there's nothing keeping restaurants or bars from charging to use the toilets. Also as far as I know, and I've used public toilets in restaurants and bars in most of the countries you list many, many times over several decades, those are exceedingly rare and absolutely not the norm. That was true 40 years ago and it's true today.

    The type of toilet is a different thing, and yeah, until maybe the late 90s a lot of Europe was no stranger to squatting toilets. Honestly, for pubs and places where you're mostly disposing of the drinks you're having, I'm not even sure they're a bad idea. Less accessible and whatnot, but I'm not sure a sit-down toilet with a patina of beer and urine developed over years of sloppy drunken aim is a safer or cleaner proposition.

  • Yep, that was my point. There's nothing fundamentally alien about using desktop Linux for most tasks when it's standardized and preinstalled; you see that with the Raspberry Pi and Steam OS and so on. The problem is that people like to point at that (and less viable examples like ChromeOS or Android) as evidence that desktop Linux is already great and intuitive and novice-friendly, and that's just not realistic. I've run Linux on multiple platforms on and off since the 90s, and to this day the notion of getting it up and running on a desktop PC with mainstream hardware feels like a hassle, and the idea of getting it going on a bunch of more arcane hardware, like tablet hybrids or laptops dependent on first party drivers, just doesn't feel reasonable unless it's a hobbyist project.

    Those things aren't comparable.