It also sets the context length to 2k by default iirc, which breaks a lot of tasks and makes a bad first impression on users who are likely trying local models for the first time.
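The comment doesn't name the runtime, but assuming it's Ollama (whose default num_ctx has historically been 2048), here's a minimal sketch of overriding the context window per request through its REST API; the model name and prompt are placeholders:

```python
import requests

# Minimal sketch, assuming an Ollama server on the default port.
# "options.num_ctx" raises the context window above the ~2k default
# for this one request; model name and prompt are placeholders.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1",            # placeholder model name
        "prompt": "Summarize this long document: ...",
        "stream": False,
        "options": {"num_ctx": 8192},   # override the 2k default
    },
    timeout=600,
)
print(resp.json()["response"])
```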
The statement says clinical evidence shows that once a patient's motor skills and language function have declined by a certain amount, the drug no longer provides a benefit in slowing the progression of the disease.
Non-healing consumables are always either so strong they're required to finish the game (Terraria) or so weak that they may as well be a placebo (Cyberpunk).
Exchange has always been done with IOUs. Even when bartering was the meta, people still exchanged promissory notes for larger-scale transactions where they didn't have the goods on hand.
Rather than CPUs, I think these are a much bigger deal for GPUs, where memory is much more expensive. I can get 128GB of RAM for 300 CAD; the same amount in VRAM would be several grand.
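To make the gap concrete, a quick cost-per-GB calculation using the comment's RAM figure and an assumed "several grand" VRAM figure for illustration:

```python
# Back-of-the-envelope cost per GB; the VRAM price is an assumed
# illustrative number ("several grand"), not a real quote.
capacity_gb = 128
ram_cad = 300      # 128GB of system RAM, per the comment
vram_cad = 3000    # assumed price for 128GB worth of VRAM

print(f"RAM:  {ram_cad / capacity_gb:.2f} CAD/GB")    # ~2.34 CAD/GB
print(f"VRAM: {vram_cad / capacity_gb:.2f} CAD/GB")   # ~23.44 CAD/GB, ~10x
```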
You're licking the boots of a company that uses the work of others without compensation or credit, then sells it back to you at a premium. This is exactly the behavior the GPL aimed to prevent. I have nothing against the technology if it's built with permission and benefits the people it depends on, but that's clearly not the case here.
Luddites are an apt comparison. The Luddites fought to protect their industry from industrialists who aimed to replace them with cheap, low-skilled and child labour. The goal of AI isn't advancement, it's replacement, and most of the companies pushing it are transparent about that.
Seems pretty underwhelming. They're comparing a 109B to a 27B and it's only kind of close. I know it's only 17B active, but that's irrelevant for local users, who are far more likely to be filtered by memory than by speed.
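Rough weight-memory math (the 4-bit quantization width is an assumption) shows why total parameter count, not active count, is what filters local users:

```python
# Approximate memory just to hold the weights, assuming 4-bit
# quantization (0.5 bytes/param); KV cache and activations excluded.
BYTES_PER_PARAM = 0.5  # assumed 4-bit quant

for name, billions in [("109B MoE (17B active)", 109), ("27B dense", 27)]:
    gb = billions * 1e9 * BYTES_PER_PARAM / 1e9
    print(f"{name}: ~{gb:.0f} GB of weights")
# The MoE still needs all ~109B params resident: ~55 GB vs ~14 GB.
```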
They'll sell each of them off to be run into the ground by some other billionaire. Both are heavily subsidized by Google's ad business, which is still somewhat unobtrusive up front. As much as Google's services have degraded, it will be much worse with another company at the helm trying to squeeze as much value out of its investment as possible.