Yah, nvm.
Alright, so you don't acknowledge the problem; that still fits the definition.
Fair enough on most of those areas you mentioned, by the way: wars, economic depression, and the pandemic have been exaggerated, with the serious caveat on the latter that this was unclear at the time, so you had to err on the side of caution. But it's kind of the opposite with climate change IMHO. Scientists have kept to rather conservative projections so as not to cause panic and apathy in the general public, but over the last few years new measurements have outpaced those projections at practically every step of the way.
Since most of what I would have said has already been mentioned, I will just go with almost anything under the umbrella of the KDE organization.
As in the Plasma desktop environment and the whole application suite. That includes programs like Krita, Kdenlive, and KDE Connect, plus the whole range of "standard" desktop applications: terminal, file manager, document viewers, and so on.
And the DE itself is just adorably hackable. Want to replace the KWin window manager with i3? Sure, it's possible, here you go: https://userbase.kde.org/Tutorials/Using_Other_Window_Managers_with_Plasma
Can confirm this still worked for me as of two months ago.
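If memory serves, the gist of that tutorial (on an X11 session; Wayland works differently, and the exact path may vary by distro) is a one-line environment script that tells Plasma which window manager to launch instead of KWin:

```
# ~/.config/plasma-workspace/env/wm.sh
# Plasma sources *.sh scripts in this directory at session startup;
# KDEWM names the window manager to start in place of KWin.
export KDEWM=/usr/bin/i3
```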
I’m almost the same as OP, but I wouldn’t call it “head in the sand”. [...] Stop following the news, and you won’t notice a thing.
Well I understand the impetus, but that's literally the definition of the head-in-the-sand idiom according to Merriam-Webster:
unwilling to recognize or acknowledge a problem or situation
Could you elaborate on what you mean about the development of deep learning architecture in recent years?
Transformers. Fun fact: the T in GPT and BERT stands for "transformer". They are a neural network architecture that was first proposed in 2017 (or 2014, depending on how you want to measure). Their key novelty is implementing an attention mechanism and a context window without recurrence, which is how most earlier NNs handled that.
The wiki page I linked above is admittedly a bit technical; this article's explanation might be a bit more friendly to the layperson.
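To make the "attention without recurrence" point concrete, here is a toy sketch of scaled dot-product attention, the building block the 2017 paper is built around. It's plain NumPy; the variable names and the identity Q/K/V projections are my simplification, since real transformers learn separate projection matrices and add multiple heads, positional encodings, etc.:

```python
# Toy scaled dot-product attention: every token attends to every
# other token in the context window via one matrix multiplication.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Each output row is a weighted average of the value rows V,
    weighted by how well the matching query row matches each key row.
    Note: no loop over previous tokens anywhere."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # pairwise query/key similarity
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V

# A "context window" of 4 token embeddings of dimension 8
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
# Self-attention: Q, K and V all come from the same input
out = attention(tokens, tokens, tokens)
print(out.shape)  # (4, 8): one contextualized vector per token
```

The point is that the whole context is processed in parallel rather than token by token, which is what made transformers so much more parallelisable (and hence scalable) than the recurrent networks that preceded them.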
LLMs aren’t really novel in terms of theoretical approach: the real revolution is the amount of computing power and data to throw at them.
This is 100% true. LLMs, neural networks, markov chains, gradient descent, etc. etc. on down the line is nothing particularly new. They’ve collectively been studied academically for 30+ years.
Well, LLMs, and particularly GPT and its competitors, rely on transformers, which are a relatively recent theoretical development in the machine learning field. Of course it's based on prior research, and maybe there is even prior art buried in some obscure paper or 404 link, but if that's your measure then there is no "novel theoretical approach" for anything, ever.
I mean, I'll grant that the available input data and compute for machine learning have increased exponentially, and that's certainly an obvious factor in the improved output quality. But that's not all there is to the current "AI" summer; general scientific progress played a non-minor part as well.
In summary, I disagree that data/compute scale is the deciding factor here; it's the deep learning architecture IMHO. The former didn't change that much over the last half decade, the latter did.
Looks cool, but somehow labeling the Ethernet port as "1000 Mbps WLAN" in their marketing material doesn't inspire confidence.
Semi-off-topic rant incoming, but hard disagree on this one. This is a really weird statement that is commonly used for the opposite of what it actually means (although not in this case). I don't enjoy syntactic discussion, e.g. about whether I used the wrong sentence structure or whatever, as long as the meaning is clear. But discussion of "the meaning of words", i.e. their semantics, is absolutely necessary in many cases; here, about whether we use the same definition of this idiom. You can't properly communicate without that, so if you don't discuss semantics where appropriate, you are talking at each other instead of with each other, despite using the same language.
Case in point here: you are operating from your intuitive definition of the head-in-the-sand idiom, which doesn't fit the situation at hand, while I'm operating from the Merriam-Webster definition, which does.
Just to be clear, I don't intend any judgement here; I'm just saying it fits that one specific definition of this idiom, which is why I quoted it originally.
As stated in the grandparent of this comment, I can agree with many of your examples, so I understand your revulsion at categorising your behaviour as sticking your head in the sand. But to people who recognise and acknowledge the problem, unlike you, who recognises but doesn't acknowledge it, you are sticking your head in the sand.