Reddit says it is not covered by new Online Safety Code as it has moved its jurisdiction to the Netherlands
SirEDCaLot @lemmy.today · Posts: 0 · Comments: 545 · Joined 2 yr. ago
On the censorship thing, maybe it is okay if an online messaging website bans certain content, like pro-suicide content, or pro-terrorism content, etc. You could call that censorship but you could also call it safety.
I think that can go either way, and I have no problem if a platform decides to ban that kind of stuff. I certainly have no desire to consume such material.
I have a BIG problem when the government decides that platforms are required to ban things. Even if they're things I don't myself want to read.
It's a slippery slope.
This is actually a real problem, more so in this case than most. There are an awful lot of satellites in low Earth orbit, at altitudes of a few hundred to several hundred kilometers. Atmospheric drag still exists there, a little bit, so space junk reenters and burns up within years or decades.
This satellite was in geostationary orbit, at an altitude of about 36,000 km. Debris up there can take hundreds of years to come down. Geostationary is a special altitude where the satellite orbits at exactly the same rate as the Earth spins. That means that a fixed dish on Earth will always point at the satellite without needing to move or track. So there's just one narrow orbital ring around the equator for that. That ring is not a place we want space junk to be, because if it gets too hazardous for satellites in GEO that basically removes our capability as a species to use fixed satellite dishes for anything. And that problem won't go away for centuries.
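For anyone who wants to sanity-check the numbers, the roughly 36,000 km figure falls straight out of Kepler's third law. A quick back-of-the-envelope sketch (constants rounded, names my own):

```typescript
// Back-of-the-envelope check of the geostationary altitude.
// Requirement: the orbital period equals one sidereal day, so a fixed dish
// on the ground always points at the same spot in the sky.

const GM = 3.986004418e14;   // Earth's gravitational parameter, m^3/s^2
const siderealDay = 86164;   // one full rotation of the Earth, in seconds
const earthRadius = 6378e3;  // equatorial radius, in metres

// Kepler's third law: T^2 = 4*pi^2*r^3 / GM  =>  r = cbrt(GM*T^2 / (4*pi^2))
const orbitRadius = Math.cbrt((GM * siderealDay ** 2) / (4 * Math.PI ** 2));
const altitude = orbitRadius - earthRadius;

console.log(`orbit radius ≈ ${Math.round(orbitRadius / 1000)} km`); // ~42,164 km
console.log(`altitude     ≈ ${Math.round(altitude / 1000)} km`);    // ~35,786 km
```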
In concept I agree with him on that. I support your right to say awful shit, but I am not going to spread that message to others. Where Elon lost the plot was thinking of Twitter as a public square. It's a nice thought, but it requires the whole platform to be 100% neutral and unbiased. So it's all good to call Twitter the public square, but that's a lot harder to take seriously when the guy in charge of policing the square is heavily biased.
how can we protect people without this censorship?
We don't, nor should we try to.
Protecting people's feelings from offense is not a valid activity in a free society. The second you start down the road of 'we must regulate this guy's words and actions to protect that guy's feelings' we become a nanny state full of people with paper thin skins.
We accept that one consequence of free speech is that sometimes people will say things that are hurtful. We do that because the alternative is getting rid of free speech.
Hate must be addressed at its root.
I could not agree more. Fighting hatred with hatred only breeds more hatred. But that seems to be the standard strategy today: it's considered okay not just to refuse to tolerate intolerance, but to be actively intolerant of anyone who seems intolerant. That is just fighting bad with bad, and the result is more bad.
The way we fight the roots of hatred is with open discourse. The people who have hate in their hearts, we do not isolate them, we do not wall them off from society, we do not practice and encourage intolerance against them. We show them a better way. We make ourselves examples of doing better, not just against the people they don't like, but against the people we don't like.
We try to build bridges and encourage communication. For all the people who say immigrants are lazy lawbreakers, we show them immigrants who are the hardest working motherfuckers there are and pay their taxes. For the people who think black people are a problem, we introduce them to black people who break their stereotypes.
For the overwhelming majority of people who have hate in their hearts and intolerance and prejudice, those feelings are based on stereotypes.
People don't join the KKK because they start in a mixed culture and then conclude black people are a problem. They join the KKK because they have stereotypes they see reinforced in media and TV.
There was a famous Black dude whose name I don't remember, but he of his own volition managed to deprogram a whole bunch of KKK members. All he did was sit down and fucking talk to them. That's it. Like sit down at the bar next to them and start a conversation. Many of the KKK members had never encountered a respectable well-spoken black person before (let alone one willing to talk to them) and were completely blown away because it broke the stereotype of a black person that they joined the KKK to fight against.
A good number of them ended up leaving the KKK and giving this man their robes on the way out. So there's this friendly black dude who has a big box of KKK robes that were given to him by ex-members he deprogrammed.
That is how we fight hate. We fight hate with love, we fight intolerance with tolerance and open arms, we fight stereotypes with exposure, we fight ignorance with knowledge.
Otherwise it's like we are saying there's too much stupidity in society so we're going to prevent people with lower IQs from attempting school. It doesn't work.
There's a big difference between hate speech and revenge porn.
A person has rights to their likeness and image. That's why anybody who goes in front of a camera, be it a porn star or a model or an actor, signs a 'model release' giving the photographer authorization to publicize and sell their images. Without that simple one page contract, nothing in the photo shoot can be published. Porn actors do that. And in fact, they usually do it on video, where the actor holds up their driver's license and says 'my name is blah blah I am a pornographic actor and I am consenting to have sex on camera today and authorize this production company to publicize and sell the resulting video' or something like that. Revenge porn victims have made no such agreement, and while the penalties are stronger because of the harm it causes them, the legal basis for having any penalty at all is simply that they did not consent to having their likeness and image publicized.
Hate speech has no such issue. It may be harmful to a person or group, but if you remove the very broad 'hatred' label, it becomes just an opinion that would otherwise be protected speech.
The other problem is that what counts as hatred is very much subjective. For example, if I say wanting to own a gun is evidence of mental illness, a lot of people on Lemmy will agree with that and I will probably get upvotes. If I say wanting to use the bathroom of other than your biological genetic sex is evidence of mental illness, I will probably get banned. What is the difference between the two? Supporting LGBT rights is popular, supporting the Second Amendment is not. So you create a situation where the only difference between a valid opinion and an invalid one is whether or not it's accepted by the mainstream, and that's a bad way to go.
Also, in a free country, it is generally accepted that expressing an opinion which may be detrimental to others is not in itself bad. If I say that people over 80 years old should require a yearly driving test, that's a valid position for me to have and nobody will call me ageist for saying it. If I say that Donald Trump should be arrested rather than elected, that is directly detrimental to a person but it would get me upvotes here. If I said that being Republican is evidence of mental illness, that is directly prejudicial against an entire group which has many different reasons for believing as they do, and it would probably get me upvotes also.
My point is, hate speech as a concept is difficult to define and when you try to ban it with censorship you are just starting down a slippery slope that will have the opposite of the desired effect. You legitimize the counterculture and do nothing to stop the real problem, the actual hatred.
You are addressing the wrong problem. You're focusing on the symptom rather than the disease.
Fighting hate speech rather than hatred itself only strengthens the hatred. As soon as you say "you mustn't say that" you validate the hatred and give it power. Look at any counterculture, positive or negative. Trying to suppress it only validates it, gives it legitimacy as being important enough for the establishment to want to suppress, and if the people who might support the hatred already don't like the people who would suppress the hate speech, you've just poured fuel on the fire.
The problem to be fixed isn't hate speech, it's hatred. That's a tougher problem to solve, but a much more important one, and the one where solving it actually produces results.
That I would actually very much agree with. As Elon himself said in the early days of the Twitter takeover, "free speech does not mean free reach".
This is also why I think engagement algorithms are a cancer on our civilization. If it is in a platform's monetary interest to amplify the most vile, anger-inducing stuff, whether that's something actively bad like hate speech or simply divisive like a lot of political crap, that is bad for our society. It pushes us farther apart when we should be coming together to fix the problems that we can agree on.
You don't have to be a porn star or even a porn consumer to oppose laws banning porn.
And you don't have to be a shitbag to recognize that, while well-intentioned, censorship is still censorship.
I have absolutely no love whatsoever for the people who would spread such crap. I would love to get rid of it. But banning the speech doesn't do that. It's like smashing the altimeter in the airplane and then declaring that you're not crashing anymore. But the reality is, smashing instruments in the airplane is never a great idea whether you are crashing or not. It just prevents you from seeing things you don't want to. And you get hurt in the process.
Censorship, historically, has never ended up anywhere good.
Absolutely fuck spez.
But he's right here. Just because he's a fuckstick doesn't mean he's always wrong on every issue 100% of the time.
Various forms of censorship under the flag of 'online safety' have been pushed by governments since the internet began to exist. And before that with print media and television. Censorship is not the answer. Never was. First it was for porn, then it was for video games, then it was for hate speech, it's always something.
But in the words of Captain Jean-Luc Picard,
"With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably."
Censorship must be opposed.
Not really, because their rights have not been violated; nothing was stolen from them. They were presented with a software product that had a limited license, and they accepted that. As far as they are concerned, the developer has fulfilled their contractual obligation to them; they were never offered a GPL license, so they got exactly what they were offered.
The author of the GPL'd code, however, is another story. They wrote software distributed under the GPL; Winamp took that code and included it without following the GPL. Thus that author can sue Winamp for a license violation.
Now if that author is the only one who wrote the software, the answer is simple: Llama Group pays them some amount of money for a commercial license of the software and a contract stating that this settles any past claims.
However, if it's a public open source project, it may have dozens or hundreds of contributors, each of whom is an original author, and each of whom licensed their contribution to the project under GPL terms. That means the project maintainer has no authority to negotiate or take payments on their behalf; each of them would have to agree to that commercial license (or their contributions would have to be removed from the commercial version of Winamp going forward). They would also each have standing to sue Llama Group for the past unlicensed use of the software.
Unless you are one of the original developers who wrote the GPL code included in Winamp, you have no standing to sue them anyway.
Not necessarily. It means that Llama Group, and perhaps the original Nullsoft, have violated the license of whatever open source developer wrote that code originally. So the only ones who could actually go after them to force anything are the ones who originally wrote that GPL code. They would basically have to sue Llama Group, and they might also have a case against Nullsoft / AOL (who bought Nullsoft) for unjust enrichment over the years Winamp was popular.
Chances are it would get settled out of court; they would basically get paid a couple thousand bucks to go away. Even if they did have the legal resources to take it all the way to a trial, it is unlikely the end result would be compelling a GPL release of all of the Winamp source. It would be entertaining to see them try, though.
Complicating that however, is the fact that if it's a common open source library that was included, there may be dozens of 'authors' and it would take many or all of them to agree to any sort of settlement.
Here's the story:
Company buys the rights to Winamp, tries to get the community to do their dev work for free, fails. That's it.
The 'Winamp source license' was absurdly restrictive. There was nothing open about it. You were not allowed to fork the repo, or distribute the source code or any binaries generated from it. Any patches you wrote became the property of Llama Group without attribution, and you were prohibited from distributing them in either source or binary form.
There were also a couple of surprises in the source code, like improperly included GPL code and some proprietary Dolby source code that never should have been released. The source code to the Shoutcast server was also in there, which Llama Group doesn't actually own the rights to.
This was a lame attempt to get the community to modernize Winamp for free, and it failed.
Of course many copies of the source code have been made, they just can't be legally used or distributed.
Today I learned. Thanks for that!
There's currently no way to delete an uploaded image.
That's especially problematic since pasting any image into a reply box auto-uploads it. So if your finger slips and you upload something sensitive, or if you want to take down something you uploaded previously, there's no way to do it.
What should happen is whenever you upload an image, the image and delete key get stored in some special part of your Lemmy account. Then from the Lemmy account management page you can see all your uploaded images and delete them individually or in bulk.
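Purely to illustrate the idea, here's a sketch of what that per-account registry could look like. This is hypothetical, not Lemmy's or pict-rs's actual data model, and all the names and the delete endpoint are made up:

```typescript
// Hypothetical sketch of a per-account upload registry. Not Lemmy's real
// schema; the types, names, and delete endpoint are illustrative only.

interface UploadRecord {
  imageUrl: string;    // public URL of the uploaded image
  deleteToken: string; // secret returned by the image host at upload time
  uploadedAt: Date;
}

const uploadsByUser = new Map<string, UploadRecord[]>(); // stand-in for real storage

function recordUpload(userId: string, imageUrl: string, deleteToken: string): void {
  const records = uploadsByUser.get(userId) ?? [];
  records.push({ imageUrl, deleteToken, uploadedAt: new Date() });
  uploadsByUser.set(userId, records);
}

// Everything a user has uploaded, e.g. for an "Uploads" page in account settings.
function listUploads(userId: string): UploadRecord[] {
  return uploadsByUser.get(userId) ?? [];
}

// Delete one upload: call the image host with the stored token, then forget it.
async function deleteUpload(userId: string, imageUrl: string): Promise<void> {
  const records = uploadsByUser.get(userId) ?? [];
  const target = records.find((r) => r.imageUrl === imageUrl);
  if (!target) return;
  await fetch(`${target.imageUrl}?delete=${target.deleteToken}`, { method: "DELETE" }); // made-up endpoint
  uploadsByUser.set(userId, records.filter((r) => r !== target));
}
```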
So it seems you can now do this: Profile > Uploads shows you all your uploads. Go Lemmy!
No, it's actually pretty simple. No containers. Your passkeys can be managed in the browser (Google Password Manager), by a plug-in like Bitwarden, or on a third-party hardware device like a YubiKey.
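For anyone curious what's under the hood: the site just calls the browser's standard WebAuthn API, and the browser/OS decides where the key lives. A rough sketch (placeholder challenge and made-up rp/user values; a real server supplies and verifies those):

```typescript
// Minimal passkey registration sketch using the standard WebAuthn browser API.
// The challenge, rp, and user values below are placeholders; in a real flow
// the server generates the challenge and verifies the credential it gets back.

async function registerPasskey(): Promise<Credential | null> {
  return navigator.credentials.create({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-provided in practice
      rp: { id: "example.com", name: "Example Site" },
      user: {
        id: new TextEncoder().encode("user-1234"),
        name: "alice@example.com",
        displayName: "Alice",
      },
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },   // ES256
        { type: "public-key", alg: -257 }, // RS256
      ],
      authenticatorSelection: {
        residentKey: "required",       // discoverable credential, i.e. a passkey
        userVerification: "preferred", // PIN or biometric check
      },
    },
  });
}
```

The site never sees a private key; whether the credential ends up in the browser's password manager, Bitwarden, or a YubiKey is between the user and their authenticator.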
And, with respect, this view is more naive (IMHO) because it's focused on the size of the company, and you can't do that. You can't have one set of laws for small companies and another set of laws for large companies.
So if Google has to pay to link to IA, then so does DuckDuckGo and any other small upstart search engine that might want to make a 'wayback machine this site!' button.
Google unquestionably gets value from the sites they link to. But if that value must be paid, then every other search engine has to pay it also, including little ones like DDG. That basically kills search engines as a concept, because they simply can't work on that model.
Thus I think your view is more naive, because you're just trying to stick it to Google rather than considering the full range of effects your policy would have.
Strong disagree. If I make a website people like, and Google links to it, should Google have to pay me? If so, Google basically can't exist. The record keeping of tracking every single little website that they owe money to or have to negotiate deals with would be untenable. And what happens if a large tech journal like CNET or ZDNet links to the website of a company they are writing an article about? Do they have to pay for that? Is the payment covered by the publicity? Is it different if they link to a deep page versus the front page?
What you are talking about opens up a gigantic can of worms that there is no easy solution to, if there is any solution at all.
I will absolutely give you that what Google is doing is shitty. If Google is basically outsourcing their cache to IA, they should be paying IA for the additional traffic and server load. But I think that 'should' falls under being a good internet citizen and treating a non-profit fairly, not under any actual requirement.
I had not realized that. They should absolutely be allowed to do it, but it's super shitty of them to basically offload that cost onto IA. IA of course would be well within their rights to try and monetize it: look at incoming traffic that deep-links to a cached page with a Google.com referrer, and throw up a splash page or top banner asking for a donation.
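Mechanically that check is trivial; something along these lines (hypothetical names and paths, obviously not anything IA actually runs):

```typescript
// Hypothetical sketch: decide whether to show a donation banner when an
// archived page is reached via a Google result. Names and paths are made up.

function shouldShowDonationBanner(requestPath: string, referer?: string): boolean {
  const isCachedDeepLink = requestPath.startsWith("/web/"); // Wayback-style archive URLs
  if (!isCachedDeepLink || !referer) return false;
  try {
    const host = new URL(referer).hostname;
    return host === "google.com" || host.endsWith(".google.com");
  } catch {
    return false; // malformed Referer header
  }
}

// Example: an archived page reached from a Google search result.
console.log(shouldShowDonationBanner("/web/2024/https://example.com/", "https://www.google.com/"));
// -> true
```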
That's absolutely the one! Truly great American. We could all learn a thing or two from him.