Not that I remotely want to defend reddit, but from a development standpoint it's much easier to maintain and secure a single login workflow. Whatever nonsense the new/old front ends require, it's probably much easier to make that work with a single unified token than it is to maintain both separately.
As long as the login remembers which frontend it came from, I wouldn't be too up in arms here. If it dumps you out to www/new.reddit then that's completely fucked and there's no excuse.
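Purely as a sketch of what "remembers which frontend it came from" could look like (not reddit's actual code): one login endpoint, one session token that every frontend accepts, and a return-to parameter carried through the flow. The endpoint, the `dest` parameter, and the example.com domains are all made up for illustration.

```python
# Hypothetical sketch: unified login that still sends you back to the
# frontend you started on. All names/domains are placeholders.
import secrets
from urllib.parse import urlparse

from flask import Flask, request, redirect, make_response

app = Flask(__name__)

# Only bounce users back to frontends we actually run (avoids open redirects).
ALLOWED_FRONTENDS = {"old.example.com", "new.example.com", "www.example.com"}


def issue_session_token(user: str, password: str) -> str:
    """Placeholder: a real implementation would verify credentials first."""
    return secrets.token_urlsafe(32)


@app.post("/login")
def login():
    dest = request.form.get("dest", "https://www.example.com/")
    if (urlparse(dest).hostname or "") not in ALLOWED_FRONTENDS:
        dest = "https://www.example.com/"

    token = issue_session_token(request.form["user"], request.form["password"])
    resp = make_response(redirect(dest))  # back to old.* or new.*, wherever they started
    # One unified token, valid across every frontend on the same parent domain.
    resp.set_cookie("session", token, domain=".example.com", secure=True, httponly=True)
    return resp
```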
Yeah I'm really curious what his take is going to be on this one lol. Technically it doesn't have a layer-2-capable bridge mode the way some other VPN solutions like OpenVPN do, but that's about all I can think of. It's still objectively a virtual network, made private by a keypair exchange.
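For anyone unfamiliar, the "made private by a keypair exchange" part is basically the whole config. Keys, addresses, and the hostname below are placeholders:

```ini
# Minimal WireGuard peer config sketch (placeholder keys/addresses).
[Interface]
Address = 10.0.0.2/24                    # this machine's address on the virtual network
PrivateKey = <this machine's private key>

[Peer]
PublicKey = <other machine's public key>
Endpoint = vpn.example.com:51820
AllowedIPs = 10.0.0.0/24                 # layer-3 routing only; no L2 bridging like OpenVPN's TAP mode
```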
Probably just blindly parroting something someone told him. Awkward way to learn that one lmao.
I would guess that it goes off the stricter of IP address geolocation and billing address. If either of those says US, Google/Apple would probably be required not to distribute it.
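In code, the "if either says US" check would be something as dumb as this (entirely hypothetical, just to illustrate the logic):

```python
# Hypothetical store-side eligibility check: the stricter of the two signals wins.
RESTRICTED = {"US"}

def can_distribute(ip_country: str, billing_country: str) -> bool:
    """Allow distribution only if neither geolocation nor billing says US."""
    return ip_country not in RESTRICTED and billing_country not in RESTRICTED

assert can_distribute("CA", "CA")        # neither signal is US -> allowed
assert not can_distribute("US", "CA")    # IP geolocation says US -> blocked
assert not can_distribute("CA", "US")    # billing address says US -> blocked
```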
It is possible to be both anti-Chinese-government and in favor of comprehensive privacy laws in the US. Like, I absolutely buy that the Chinese government has access to tiktok data. I, however, don't think forcing a sale is the right way to deal with any of this. Comprehensive privacy and data collection laws would go much farther towards making it so it doesn't really matter who owns what.
Unless the bill has changed since the last time I read it, there were fines for hosting the service in US datacenters, and fines for companies allowing US data to exist in non-US datacenters. I don't think you could interpret the bill as imposing a civil penalty on a user who accesses it over a VPN.
That would be true; it'd be pretty difficult to build a model without any pictures of children at all and then try to describe to the model how to alter an adult to make a child. Is anyone asking for that, though? To make it illegal to have regular pictures of children in these datasets?
I'm not going to say that csam in training sets isn't a problem. However, even if you remove it, the model remains largely the same, and its capabilities remain functionally identical.
Yeah, turns out when the monopolies are eliminated, people get more competition and a better deal on the consumer end. It's why I'll never understand people who say streaming services became as bad as cable.
I'd argue that streaming is in such a bad place right now because each streaming service has a monopoly on its own content. Sure, you could argue that studios "compete" with each other on the content they produce, but cable companies were a different layer of the stack entirely. Cable companies all offered the same channels and the same content, and in the areas where they did overlap, competition to deliver those channels best was great. What made cable bad was that there was little incentive for companies to compete geographically. In the era of streaming, companies have little incentive to let their content compete across platforms.
If you ask me, every streaming platform should be broken off from its production parent, so that streaming companies can compete on what they offer and how they deliver it. There is no incentive for the platforms themselves to compete with each other. It's all about how hard the services can enshittify before people stop watching the content they have a monopoly on.
You should consider reversing the roles. There's no reason your homelab can't be the client and your VPS the server. Once the WireGuard virtual network exists, traffic doesn't really care which side was the client and which was the server. It also saves you from opening a port to attackers on your home network.
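Sketch of what that reversal looks like as WireGuard configs (keys, addresses, and the hostname are placeholders). The only listening port is on the VPS; the homelab just dials out and keeps the tunnel alive:

```ini
# --- VPS ("server"): the only side with an open port ---
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <vps private key>

[Peer]                           # the homelab
PublicKey = <homelab public key>
AllowedIPs = 10.8.0.2/32

# --- Homelab ("client"): no inbound port at home ---
[Interface]
Address = 10.8.0.2/24
PrivateKey = <homelab private key>
# note: no ListenPort

[Peer]                           # the VPS
PublicKey = <vps public key>
Endpoint = vps.example.com:51820
AllowedIPs = 10.8.0.0/24
PersistentKeepalive = 25         # keeps the NAT mapping open so the VPS can always reach back in
```

From there, a reverse proxy on the VPS pointed at 10.8.0.2 gets you inbound services without a single open port at home.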
It doesn't need csam data for training, it just needs to know what a boob looks like and what a child looks like. I run some sdxl-based models at home, and I've found the overlap is harder to avoid than you'd think. There are keywords in porn that blur the lines across datasets ("teen", "petite", "young", "small", etc). The word "girl" in particular: I've found that adding it to basically any porn prompt gives you a small chance of inadvertently creating the undesirable. You have to be really careful and use words like "woman", "adult", etc instead to convince your image model not to make things that look like children. If you've ever wondered why internet-based porn generators are under super heavy guardrails, this is why.
TL;DW for those of us who don't learn well from video content?