This has not been my experience. I can usually tell when my phone bogs down loading an intense website or switching between 3 apps after a couple years. If I want my phone to last a long time and I want to enjoy using it I've found that it's best to start with a device that is as powerful as possible.
I'm currently applying for jobs and I don't even bother with unreasonable ranges. I have a target salary so I won't play games if the low end of your range is half that.
I feel like you keep misrepresenting what I'm saying. Nowhere did I say that our brains work completely and exactly the same as AI. However, we do learn in much the same way: by accumulating small amounts of information and drawing connections between them.
Well, what you described is simply not perfect recollection. It's many small tidbits of information that, combined, can produce a larger output.
No, avoiding imperfect recall would take exponentially increasing resources; imperfect recall is the consequence of keeping models a practical size. Smaller models have "worse" recall: they've been trained on smaller datasets (or pruned more).
As you increase the size of the model (number of "neurons" that can be weighted) you increase the ability of that model to retain and use information. But that information isn't retained in the same form as it was input. A model trained on the English language (an LLM, like ChatGPT) does not know every possible word, nor does it actually know ANY words.
All ChatGPT knows is which characters (or tokens) are statistically likely to follow one another in a long sequence. With enough neurons and layers, plus large amounts of processing power and training time, this yields a weighted model that is many orders of magnitude smaller than the dataset it was trained on.
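To make "statistically likely to go after another" concrete, here's a toy next-character predictor built from bigram counts. This is purely an illustration of the idea; real LLMs use tokens and deep transformer networks, not pair counts:

```python
from collections import Counter, defaultdict

# Count, for each character, which characters follow it in a tiny "corpus".
corpus = "the cat sat on the mat"
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def most_likely_next(char):
    """Return the statistically most likely character to follow `char`."""
    return follows[char].most_common(1)[0][0]

print(most_likely_next("t"))  # 'h' -- "th" is the most common pair here
```

A trained LLM is doing a vastly more sophisticated version of this, but the core job is the same: predict what plausibly comes next.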
Since the model weighting itself is smaller than the input dataset, it is literally impossible for the model to have perfect recall of the input dataset. So by definition, these models have imperfect recall.
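Some back-of-the-envelope arithmetic shows the gap. The parameter and dataset counts below are hypothetical round numbers chosen to illustrate the order of magnitude, not figures for any specific model:

```python
# Illustrative sizes only -- counts are hypothetical, chosen to show the
# order-of-magnitude gap between model weights and training data.
params = 7_000_000_000          # 7 billion parameters
bytes_per_param = 2             # 16-bit weights
model_bytes = params * bytes_per_param

tokens = 2_000_000_000_000      # 2 trillion training tokens
bytes_per_token = 4             # rough average of raw text per token
dataset_bytes = tokens * bytes_per_token

print(f"model:   {model_bytes / 1e9:.0f} GB")    # 14 GB
print(f"dataset: {dataset_bytes / 1e12:.0f} TB") # 8 TB
print(f"dataset is ~{dataset_bytes / model_bytes:.0f}x larger than the weights")
```

With the training data hundreds of times larger than the weights, verbatim storage of the dataset is physically impossible; the model can only keep a compressed statistical summary.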
This is actually incorrect, and exactly the opposite of how these work. The models these AIs run on have, by definition, imperfect recall; otherwise they would be ENORMOUS.
They train a statistically weighted model to predict outputs from inputs. It has no actual image data stored internally; it can't.
I'm not sure that's a fair reaction. If your workflow relies heavily on many complex extensions with a history of updating slowly, it's probably worth just... waiting a bit? You don't HAVE to be on the bleeding edge of Gnome releases. With a fairly minimal extension list, I haven't had problems updating to new releases in a long, long time.
There is no API, which is the problem. It's just straight code injection. That's why extensions can be so powerful. A stable API would certainly constrain that freedom.
Gnome doesn't have an extension API. That is why it is prone to breakage, since the code is injected into the actual shell. The upshot of this is that extensions can do pretty much anything. The downside is there is no stable API.
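GNOME Shell extensions are JavaScript loaded into the shell process, but the mechanism, replacing live functions rather than calling a stable API, is essentially monkey-patching. A rough Python analogy (none of these names are real GNOME internals):

```python
# Python analogy for how GNOME Shell extensions work: instead of calling a
# stable API, an "extension" replaces a function inside the running program.
# `Panel` and `show_clock` are made-up stand-ins, not real GNOME code.

class Panel:
    def show_clock(self):
        return "12:00"

# The "extension": wrap the original method with new behavior.
_original = Panel.show_clock
def patched_show_clock(self):
    return _original(self) + " (patched)"
Panel.show_clock = patched_show_clock

print(Panel().show_clock())  # "12:00 (patched)"
# Powerful -- the patch can change anything -- but if the next release
# renames or restructures show_clock, the extension silently breaks.
```

That tradeoff is the whole story: total access to shell internals, zero stability guarantees.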
Yeah, I made a point of buying a Venstar thermostat because it has a local REST API that I could hook up directly to Home Assistant and control locally. My IoT VLAN doesn't even have Internet access; it's blocked.
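For flavor, here's a sketch of what talking to a local REST API looks like. The IP, endpoint path, and parameter names below are hypothetical placeholders, not the real Venstar API, so check the device's actual local API docs:

```python
# Sketch of building a "set temperature" call against a thermostat's local
# REST API. The host, path, and field names are hypothetical placeholders.
from urllib.parse import urlencode

def build_setpoint_request(host, cool_temp, heat_temp):
    """Build the URL and form body for a local setpoint POST."""
    url = f"http://{host}/control"
    body = urlencode({"cooltemp": cool_temp, "heattemp": heat_temp})
    return url, body

url, body = build_setpoint_request("192.168.2.50", 75, 68)
print(url)   # http://192.168.2.50/control
print(body)  # cooltemp=75&heattemp=68
```

Everything stays on the LAN, which is exactly why a blocked-off IoT VLAN doesn't get in the way.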
A lot of people are missing the big picture here. Look up Starshield: SpaceX is aggressively hiring for what is essentially a DoD edition of Starlink. I suspect the US govt is not as worried about this as you think.
I liked it as a general place. Pretty much all smart home stuff works with home assistant so it was nice seeing everyone's ideas and devices regardless of platform.
I'm gonna go against the grain and say this is awesome and I'm excited for it. I love talking to other people about less-popular songs. I think it could really add a cool social aspect