Exactly. This is like if you charged the average person $1 for causing a major motorway accident.
It's a joke of a fine in the face of Boeing's profits - basically telling them they can get away with severe and reckless disregard for human life in return for just over a week's profits.
Tell you what, it certainly is a "special" military operation, considering it's taken over a year and at this point requires conscription... An awful lot like Russia just started a war.
As others have said, our reference of time comes from our own universe's rules.
Ergo if rendering 1 second of our time took 10 years of their time, we wouldn't measure 10 years, we'd measure 1 second, so we'd have no way of knowing.
It's worth remembering that simulation theory is, at least for now, unfalsifiable.
By its nature there's always a counterargument to any evidence against it, therefore it always remains a non-zero possibility, just like how most religions operate.
I mean, it's a beautiful name - who really cares if it's named after a genus of cicadas?
There are worse sounding "normal" names out there.
Plus it's named after OP's passion, I think that shows a lot of love
I didn't mean in terms of providing. I meant that if someone provided a base model, and someone else took that, built upon it, then used it for a harmful purpose - of course the person who modified it should be liable, not the base provider.
It's like if someone took a version of Linux, modified it, then used that modified version for an illegal act - you wouldn't go after the person who made the unmodified version.
SB 1047 is a California state bill that would make large AI model providers – such as Meta, OpenAI, Anthropic, and Mistral – liable for the potentially catastrophic dangers of their AI systems.
Now this sounds like a complicated debate - but it seems to me like everyone against this bill is someone who would benefit monetarily from not having to deal with the safety aspect of AI, and that does sound suspicious to me.
Another technical piece of this bill relates to open-source AI models. [...] There’s a caveat that if a developer spends more than 25% of the cost to train Llama 3 on fine-tuning, that developer is now responsible. That said, opponents of the bill still find this unfair and not the right approach.
Regarding the open-source models: it makes sense that if a developer takes the model and does a significant portion of the fine-tuning, they should be liable for the result of that...
But should the main developer still be liable if a bad actor does less than 25% of the fine-tuning and uses exploits in the base model?
One could argue that developers should be trying to examine their black boxes for vulnerabilities, rather than shrugging, saying it can't be done, and then demanding they not be held liable.
Could've told you that 10 years ago. Literally the moment online came out and they suddenly stopped talking about story DLCs, I knew we weren't getting them.
The story-mode campaign is now just a gateway drug to their online cash-cow - I bet you it'll be the same for GTA 6
The Ukrainian people are out there fighting tooth and nail to survive the Russian onslaught, yet you're in your armchair acting like they should just give up because Russia wants them to.
A law like that would've been incredibly helpful back when those Brexit buses were claiming we could somehow give the NHS £350M a week, most of which technically never even existed in the first place (since we got back something like £200M a week).
Having Farage, Boris, and their cronies be forced to resign (or even face prison) would've been a damn delight.
Thank fuck I got away from Authy years ago - cost me my Twitch account (because apparently Twitch straight up won't allow you to switch away from Authy), but it was worth it to secure the rest of my things.
That was my first thought too. It was only when the pseudoscience started that I realised I was meant to side with the Starbucks girl here, and not the soldier helping out the sewer man.
There are so many questions.
How did the soldier know there was a sewer man?
How did the soldier know the sewer man wanted coffee?
How did he know that coffee was cold enough for the sewer man to drink?
I'm less surprised by the GOP's willingness to sink humanity for the sake of short-term profits than I am that Trump takes 45 minutes to wash that mop he calls his hair
No, but it's on a similar scale as far as Boeing is concerned.