r00ty (@r00ty@kbin.life) · 2 posts · 1,265 comments · joined 2 yr. ago

I don't think it's rose-tinted glasses really. I think it's just the change in dynamic. It was definitely different during the "real" classic times (I would say classic to Wrath).
In 2005, when I started playing, you really needed to group up to get things done. When you did, you met people. You talked: not over a microphone, but you would be chatting in text. You'd get to know people, they'd invite you to dungeon groups and vice versa, and it would widen both of your in-game circles, and so on.
When I got to the position to raid, I was on an RP-PvP realm, and while there were raiding guilds, many people were in smaller guilds that were either role-playing guilds or guilds of friends. So there were often cross-guild raiding groups. I was in one of these, and we had our own guild-chat-esque channel that everyone in the group could chat through, and of course raids were mandatory voice, because you generally did need communications to raid. This widened your in-game circle too.
I still speak, via various forms of social media, to some of the people I played the game with in 2005-2010. Some I met in person, others I never did. I've not really played retail much for a while now. But it's not the same. To an extent, neither is classic now.
Now for a probably unpopular opinion: I think a lot of people believe Blizzard's actions led to this change in community spirit. I actually think it's the other way round. I think they saw their player base changing and adjusted the game to suit. The side effect is that it put off some of those with a more social gaming mindset for good. But it would have happened anyway.
Times change, and they just rolled with it.
I mean, you say that. But I think there's money to be made here. We just create a new name for the pasteurisation process and market the result as effectively raw milk. With a 200% markup, of course. Free money!
I'm on Nvidia with the proprietary (blob) driver, KDE Plasma on Wayland, on Arch. Yeah, resuming from standby is about 50/50 whether the screen comes back. I just turned off standby and kept screen sleep only.
But I'm on desktop so less of a problem for me than it would be for a laptop user.
OK, look back at the original picture this thread is based on.
We have two situations.
The first is a dedicated system providing navigation and other subsystems for a very specific purpose, with very specific, very limited hardware: an 8-bit CPU with a clearly known RISC-esque instruction set, 4 KB of RAM, and a bus to connect devices.
The second is a modern computer system with unknown hardware: one of many CPUs offering the same instruction set but with differing extensions, and a lot of memory attached.
You are going to write software very differently for these two systems. You cannot realistically abstract on the first system; in reality you can't even use libraries directly. At best you can borrow code from a library. On the second system you MUST abstract, because you don't know whether the target system will run an Intel or AMD CPU, what the GPU might be, what other hardware is in place, and so on.
And this is why my original comment was saying, you just cannot compare these systems. One MUST use abstraction, the other must not. And abstractions DO produce overhead (which is an inefficiency). But we NEED that and it's not a bad thing.
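As a toy illustration of that overhead (every name here is made up for the sketch, nothing to do with the actual Apollo code): writing straight to a known, fixed address versus dispatching through a driver interface that any backend could sit behind.

```python
# Toy sketch only: "video RAM" is simulated with a bytearray.

# Bare-metal style: the hardware is fixed and known, so code pokes the address directly.
framebuffer = bytearray(16)              # stand-in for memory-mapped video RAM

def draw_pixel_direct(offset: int, value: int) -> None:
    framebuffer[offset] = value          # effectively one store, no indirection

# Abstracted style: the hardware is unknown, so code goes through a driver interface.
class DisplayDriver:
    def draw_pixel(self, offset: int, value: int) -> None:
        raise NotImplementedError

class FramebufferDriver(DisplayDriver):
    def __init__(self, buf: bytearray) -> None:
        self.buf = buf

    def draw_pixel(self, offset: int, value: int) -> None:
        # Bounds check plus dynamic dispatch: the overhead paid for portability.
        if not 0 <= offset < len(self.buf):
            raise IndexError(offset)
        self.buf[offset] = value

abstracted = bytearray(16)
driver: DisplayDriver = FramebufferDriver(abstracted)

draw_pixel_direct(3, 255)
driver.draw_pixel(3, 255)
assert framebuffer[3] == abstracted[3] == 255  # same result, extra layers on one path
```

Both paths draw the same pixel; the second just pays a lookup, a check, and a call for the privilege of not caring what the display actually is.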
Exactly my point though. My original point was that you cannot compare these, and the main reason you cannot compare them is the abstraction required for modern development (which happens at both the development level and in the operating system you run on).
The Apollo software was machine code running on known bare metal interfacing with known hardware with no requirement to deal with abstraction, libraries, unknown hardware etc.
This was why my original comment made it clear, you just cannot compare the two.
Oh, one quick edit to say: I do not in any way mean to take away from the amazing achievement of the Apollo developers. That was amazing software. I just think it's not fair to compare apples with oranges.
It does. It definitely does.
If I write software for fixed hardware with my own operating system designed for that fixed hardware and you write software for a generic operating system that can work with many hardware configurations. Mine runs faster every time. Every single time. That doesn't make either better.
This is my whole point. You cannot compare the apollo software with a program written for a modern system. You just cannot.
Wait a second. When did I say abstraction was bad? It's needed now. But when you compare 8-bit machine code written for specific hardware against modern programming, where you MUST handle multiple x86/x86_64 CPUs and multiple hardware combinations (either in the executable or via the libraries that handle the abstraction), of course there is an overhead. If you want to tell me there's no overhead, then I'm going to tell you where to go right now.
It's a necessary evil we must have in the modern world. I feel like the people hating on what I say are misunderstanding the point I make. The point is WHY we cannot compare these two things!
That said there could be better ways to show info like this on the fediverse. Except, it's complicated.
You could be banned on an instance, but also separately banned on an individual community on an instance, or your instance could be defederated from one running a community. Any of which could lock you out in theory.
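Those layers are independent of each other, which is what makes the status hard to surface. A toy sketch (hypothetical helper, not any real Lemmy/Mbin API) of the decision:

```python
# Toy model of the three independent layers that can each lock a user
# out of a federated community. Names are made up for illustration.
def can_participate(instance_banned: bool,
                    community_banned: bool,
                    defederated: bool) -> bool:
    # A ban at any single layer is enough to block participation.
    return not (instance_banned or community_banned or defederated)

assert can_participate(False, False, False)
assert not can_participate(True, False, False)   # banned instance-wide
assert not can_participate(False, True, False)   # banned from one community
assert not can_participate(False, False, True)   # instances defederated
```

Any UI that wants to say "you're banned" has to check all three, and only the first two are even visible to your home instance.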
It's not as clear cut as closed systems. This is mostly a good thing but for clarity, not so much :p
Except it's not nonsense. I've worked in development through both eras. You need to develop in an abstracted way because there are so many variations on hardware to deal with.
There is bloat, for sure. A lot of it is because it's usually much better to use an existing library than reinvent the wheel, and the library needs to cover many more use cases than your own. I encountered this myself: I used plain web requests to work with releases on Forgejo, had it working generally, and then saw there was a library for it. The boilerplate to make the library work was more than what I'd written to just make the web requests directly.
But that's mostly size. The bloat in terms of speed is mostly in the operating system and hardware abstraction, I think. Not libraries, by and large.
I'm also going to say that legacy systems being papered over doesn't always make things slower. Where I work, I've worked on our legacy system for decades, but on the current product for probably the past 5-10 years. We still sell both. The legacy system is not the slower system.
I did a routine upgrade on my mbin server, where I had an old version with changes I made myself.
Well, it turns out I upgraded something (probably Redis) that broke Symfony, which broke everything.
So I had a fun afternoon upgrading to the latest mbin version. I mean I needed to anyway but my hand was forced.
Yep sometimes an innocent looking update will change your weekend plans.
Anyways, any reason not to use ssh?
It's a different world now though. I could go into detail of the differences, but suffice to say you cannot compare them.
Having said that, Windows lately seems to just be slow on very modern systems for no reason I can ascertain.
I swapped back to Linux as my primary OS a few weeks ago and it's just so snappy in terms of UI responsiveness. It's not better in every way. But for sure, I never sit waiting for Windows to decide to show me the context menu for an item in Explorer.
Anyway in short, the main reason for the difference with old and new computer systems is the necessary abstraction.
So, I'm going to come to their defence a bit here. Most of this is also covered in my comment I made further into the thread.
I don't think previous generations were any less financially literate on average. You've always had those careful with credit and those that didn't seem to care, or didn't understand the ramifications of their decisions.
I grew up in the 80s and 90s, and most large stores had their own store credit system with 30%+ APR rates. Plenty of Boomers and Gen X had those accounts, and would routinely buy more whenever they cleared their credit a little.
You could also get credit cards and each card in terms of spending power would have similar limits to what you have now. And there was no shortage of people that would be sitting on their credit limit all the time. I knew people in the 90s that had no idea how interest worked and would be sitting on their credit limit paying back mostly interest all the time.
I think the difference is the ease with which you can gain access to credit now.
In the 80s and 90s you generally needed to go into the store to get their credit, and you needed to go to a bank or fill in paperwork by post to get credit cards. Crucially, there were generally fewer providers of credit. Credit cards were usually offered by banks; there were not so many resellers of credit. Gaining a line of credit you had no chance of ever repaying took more effort, and as such wasn't as much of a problem as it is now. It was still a problem, and companies routinely made money from the financially illiterate even then.
What I think is different now is that you can get credit with a few screen swipes on your phone, and there are many, many more providers of credit. As such, it's much easier to get into an irreversible credit position. I would put money down that the same people with £1000s in various store credit/cards, all compounding interest at 30%+ in the 90s, would also be in huge debt if credit had been as easy to get then as it is now.
I am going to put more of the blame on the financial institutions here (those getting into the mess are not entirely free of blame). There might be over a thousand sources of credit now, but they all funnel up to a handful of large finance institutions, and they're the ones really burying their heads in the sand, pretending they don't know this is happening and couldn't do anything to stop it. They most certainly could prevent it if they wanted to. It just works better for them to have a generation constantly paying interest on never-repayable debt, even factoring in the few debts that will be written off.
Yes, ultimately we all have our own responsibility not to get into these situations. But I don't think Gen Z or any generation are or were better at this on average. It's just the conditions that allow it have changed, and continue to change.
Buy now, pay later has been a thing since at least 2006 in the UK (I can find pictorial evidence of this: a flyer reading "buy now, pay 2007"). But I am quite sure I remember seeing this in the 80s and 90s too. For sure, most large stores had their own credit systems that worked this way.
It's not a new term, and actually I'm going to say that predatory techniques were more common in the 80s and 90s. People were definitely financially illiterate then too. Store credit was very common; I remember 29.9% being a very common APR, with some pretty long terms too. And the store credit system was of course designed so you could keep adding purchases, so you were ALWAYS paying that 29.9%, year on year.
I think the only real difference was that, with payments being far more physical before the internet, you could feel when you had borrowed too much, and people would cut back before reaching truly unrecoverable situations.
Point being, this isn't a new thing. The virtualisation of everything I think has just made it much easier for young people now to get into situations they cannot easily get out of.
In the 80s and 90s you could easily get multiple credit cards, but usually you needed to go out and get them, or at least fill in paper applications. There were also definitely fewer institutions offering them, so there was a real hard limit. Now there are all kinds of ways to get credit. However, there are only a few really large institutions at the top, and I think they should be coordinating centralised credit limits better.
My summary is, this isn't new. Just the modern world has made it very easy to make it scale into higher debts now than it did before. That's the only real difference.
The average youth of the 80s and 90s was not better at this, IMO (person that grew up in the 80s and 90s speaking here). There were just fewer opportunities pushed into your face.
This is true, and specifically for the sentence in question it's almost correct already (and I would argue an acceptable use). We'd just colloquially order it the other way round: "brothers and sisters". Let's not go down the rabbit hole of why that may be, and pretend it's just alphabetical order...
Yeah, I was going to say. I remember the type of sweets existed when I was younger. Just they weren't called sour.
And Twitter/X most likely has a similar rule. And Musk could have achieved much the same by just banning the account.
There's no reason to give it to Jones. He doesn't own any of the applicable IP any more. Maybe there'd be an argument if he tried that.
Now see, I don't like it, but the simplest thing would be for Musk to ban the account right now.
He's not circumventing anything then. Ownership of the account was transferred; that Twitter, a private entity, chose to ban it is their business.
It's not worth arguing about. The website and IP are the juicy things; being able to make satirical InfoWars programming and products is where it's at. Maybe, maybe there would be a case if Musk allowed Jones to make a new account with the same name, or otherwise handed it to him.
No. If Valve cancels your Steam account, you lose your games and they owe you a big fat zero.
Same goes for all accounts with assets attached.
Sad to say, but in this case it is Musk's platform and his rules.
If he wants to go home and take his ball too. Tough luck.
Doesn't seem right, but it is legal and already happened on multiple platforms multiple times.
You know, I think he says a lot of the stupid shit he does just so his name is constantly in the news cycle. Just like in the old 1980s game "Rock Star Ate My Hamster", the phrase "any publicity is good publicity" must be his mantra.
Anyway, I agree. I'm sick of hearing about his latest antics.