I think the confusion comes from the meaning of stable. In software there are two relevant meanings:
1. Unchanging, or changing as little as possible.
2. Not crashing or requiring intervention to keep running.
Debian, for example, focuses on #1, with the assumption that #2 will follow. And it generally does, until you have to update and the changes are truly massive and the upgrade is brittle, or you have to run software with newer requirements and your hacks to get it working are brittle.
Arch, for example, instead focuses on the second definition, by attempting to ensure that every change, while frequent, is small, with a handful of notable exceptions.
Honestly, both strategies work well. I've had Debian systems running for 15 years and Arch systems running for 12+ years (and that limit is really due to the hardware I run Arch on, rather than their update strategy).
It really depends on the user's needs and maintenance frequency.
Disingenuous how? You don't think Linux solved a real day-to-day need of its first users?
Sure, from Torvalds's perspective, it was a project specifically to solve a small problem he had. He wanted to develop for a *nix platform, but Minix wouldn't run on his hardware, and the other *nixes were out of reach.
And this was generally true in the market as well. Linux arrived just in time and was "good enough" to address a real gap: Minix was limited in scope to basically just education, Hurd was in political development hell, and the other *nixes were targeted at massive servers and mainframes. Linux filled the "*nix for the rest of us, inexpensively" niche, eventually growing in scope to displace its predecessors, despite their decades of additional professionalism and maturity.
That niche is now filled; the gap no longer exists. A "New Linux" wouldn't displace Linux, because the original already suits our needs well enough. This is precisely why the BSDs and Solaris were "too little, too late". They were in many ways better than Linux, but the problems they solve compared to Linux are tiny and highly debatable. Linux addressed a huge, day-to-day need of people who were motivated to help.
Yeah, exactly. Toy OSs have only increased in scope, scale, and number. And the public is still completely unaware, because these toy OSs don't solve day-to-day problems the way Windows, Mac, and Linux did when they first came to market.
Quora is trash, but this thread has a breakdown of many of Lucas' "inspirations", which shows he was always happy to directly copy others' art. Most of it is hilariously blatant.
Pros:
Medium mode is a fun challenge at first, eventually becoming fairly chill as you advance in skill and confidence.
Hard mode is always fairly hard, especially on harder maps.
There are many resources to manage, but none that feel burdensome.
The game is extremely thematic; it feels alive with charm.
Graphics are excellent, though you can still run into the occasional glitch.
The water. It's so hard to explain to someone who hasn't encountered this system before, but water is life in this game, and it's both beautiful graphically, and extremely well simulated by physics. Learning to control the water, and see the shortest paths to end water scarcity with beaver engineering is an amazingly fun and unique aspect of the game.
Mods are well supported and the community is vibrant.
Cons:
Not a ton of content. They've been very good about adding new mechanics (badwater, extract, etc.), but there are still just two races of beaver and a dozen or so maps.
No directed experience. In similar games I've enjoyed a campaign, challenge maps/scenarios, weekly challenges, a deeper progression system, just... something to optionally set your goals. There's nothing of the sort in the vanilla game. It's fully open-ended, and there's only one unlock outside of your progress through the resource tree on a map.
All in all, I highly recommend it, especially at the modest asking price. If you love city builders, charming and beautiful art, thematic settings, dynamic challenge, and solution engineering, this is a fantastic game for you.
Other games I've enjoyed that scratch similar itches:
KSP
Cities: Skylines (but Timberborn has been far more compelling)
Factorio
Mindustry
Planet Zoo (Timberborn has less of a directed experience, but is otherwise completely superior)
Gnomoria
Banished
Tropico series (though I view this as more casual)
The author doesn't seem to understand that executives everywhere are full of bullshit, and that marketing and journalism everywhere are perversely incentivized to inflate claims.
But that doesn't mean the technology behind that executive, marketing, and journalism isn't game changing.
Full disclosure: I'm both well informed and undoubtedly biased as someone in the industry, but I'll share my perspective. Also, I'll use "AI" here the way the author does, to represent the cutting edge of machine learning, generative self-reinforcement learning algorithms, and large language models. Yes, AI is a marketing catch-all. But most people better understand what "AI" means, so I'll use it.
AI is capable of revolutionizing important niches in nearly every industry. This isn't really in question. There have been dozens of scientific papers and case studies proving this in healthcare, fraud prevention, physics, mathematics, and many, many more.
The problem right now is one of transparency, maturity, and economics.
The biggest companies are either notoriously tight-lipped about anything they think might give them a market advantage, or notoriously slow to adopt new technologies. We know AI has been deeply integrated in the Google Search stack and in other core lines of business, for example. But with pressure to resell this AI investment to their customers via the Gemini offering, we're very unlikely to see them publicly examine ROI anytime soon. The same story is playing out at nearly every company with the technical chops and cash to invest.
As far as maturity, AI is growing by astronomical leaps each year, as mathematicians and computer scientists discover better ways to do even the simplest steps in an AI. Hell, the groundbreaking papers that are literally the cornerstone of every single commercial AI right now are "Attention Is All You Need" (2017) and "Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks" (2020). Moving from a scientific paper to production generally takes more than a decade in most industries. The fact that we're publishing new techniques today and pushing them to prod a scant few months later should give you an idea of the breakneck speed the industry is moving at right now.
And finally, economically, building, training, and running a new AI oriented towards either specific or general tasks is horrendously expensive. One of the biggest breakthroughs we've had with AI is realizing the accuracy plateau we hit in the early 2000s was largely limited by data scale and quality. Fixing these issues at a scale large enough to make a useful model uses insane amounts of hardware and energy, and if you find a better way to do things next week, you have to start all over. Further, you need specialized programmers, mathematicians, and operations folks to build and run the code.
Long story short, start-ups are struggling to come to market with AI outside of basic applications, and of course cut-throat Silicon Valley does its thing, and most of these companies are either priced out, acquired, or otherwise forced out of business before bringing something to the general market.
Call the tech industry out for the slime it generally is, but the AI technology itself is extremely promising.
Seriously. This guy thinks that regulators would have stepped in to stop OpenAI or Microsoft from acquiring a no-name, two-year-old startup with two rounds of funding?
Sins is a game my friends and I always come back to. Such a dynamic RTS with so many ways to win.
The expansions are fairly priced, and one person owning an expansion is enough to host an expansion game for everyone who has any version installed.