
  • That's interesting.

    I will say, I've heard from Parsec folks that the Nvidia video encoders are faster than the AMD encoders (I forget whether AMD or Intel has the better one).

    I run AMD on Linux; for that, it's AMD all day, every day ... Nvidia has historically been an absolute mess under Linux, for both gaming and general desktop use.

    I used Nvidia for a while on Linux, in laptops and in my desktop, and regularly ran into issues with KDE functioning as a desktop (like the Plasma panel just ... no longer updating after playing a game, so I'd think it was still 6:30 until it became obvious it was much later and my clock was stuck). That part of the situation has definitely improved; now that Nvidia has Wayland support in place, it's a fairly reasonable GPU for the desktop.

    I never had rendering issues in the games I played, but my friend who's used my old 2080 under Linux with more games has seen a lot of weird stuff. In Monster Hunter World he gets crazy white triangles (not sure if that's the right term?) that just flash onto the screen during some fights. In the recent Hunt Showdown, if his post-processing is set to medium, he gets fireflies rendering at the top of his screen and flying a million miles an hour like a bad trip. Turning on DLSS or FSR significantly LOWERS his frame rate (for me it's a significant uplift).

    On Windows, I haven't used AMD in a long time. My brother has a 7000 series AMD card and had some issues for the first few months. He was getting game crashes in AAA titles like Battlefield more than anything else, but I think he resolved that when he stopped using MSI Afterburner (?). I vaguely recall he had some program that was messing with the fan curve in a bad way, and the card was not happy about it. In terms of actual rendering, though, I don't think he's had any graphical glitches or performance issues; it was just some of his high-end games crashing before he figured out what was messing with the card.

    A friend of mine is also running AMD under Windows with a much older Vega card and has never had any problems that I've heard about, other than when we played Hunt Showdown the other day: he's suffering from the "all shadows are black" thing, which affects any card that doesn't support DirectX feature level 12.1 (for some reason AMD seems to have stopped DirectX support for that card at exactly 12.0). In any case, Crytek is going to try and fix that for folks like him (and, to be fair to AMD, it also affects some Nvidia cards; it just seems more AMD users were affected for some reason).

    I'd say VR and emulation might be a little outside the typical workload most people expect from an AMD card. Still, a 7700 XT should definitely be rendering more stable frames in general than a 1070, and it supports newer features like AV1 and such.

    The VR stuff, I've never done anything with that... But in an ideal world, nothing you run should actually be able to crash your system. So it sounds like there's some kind of bug there, assuming you're actually getting a blue screen, or the system hangs and stops painting new frames.

    If things are outright powering down ... that's more likely a PSU issue (though not having enough power can cause all sorts of weird behavior, so it might be worth verifying you've got a big enough PSU as well).

    Reporting the rendering artifacts to the developers/maintainers of the emulator is probably your best path forward on that part.

    There are definitely some market forces at play here as well, with AMD just getting fewer bug reports filed against software and drivers. So some of the bugs that Nvidia users championed to get fixed on Windows, you might have to champion and reach out to people to get fixed on AMD (on Linux, swap AMD and Nvidia).

  • I think there were some social blunders and connections missed because I got a decent phone later than my peers.

    I got my first basic phone (a phone which barely functioned and regularly crashed doing basic things) at 16, back in 2011(?), when many in my class had gotten a basic phone by 2008. By 2010, pretty much everyone had at least a basic phone, and many had smartphones.

    I wouldn't write this off as an irrelevant issue in a world where so much connection happens through phones (even if you personally don't believe you were all that affected). I do think my parents' decision to delay giving their shy-ish child, living in a rural area, a good phone (solely because they didn't have one when they were kids) was a bad one.

    Actually being able to keep up with people between classes, discuss homework, get some pretty girls' numbers earlier on, etc. ... that could've really changed my high school and middle school (or at least jr. high) experience for the better.

  • I think ... this is going to be an uphill battle. If you're in NYC, maybe you've got a shot (simply because there are so many folks around).

    However, you're looking at a minority of a minority, probably within a minority: folks you'd find attractive who are in your age group (unless liking Linux is literally the only thing that makes someone attractive to you).

    I've been on and off dating sites myself for years in the Northeast Ohio area. I've used them since my early twenties, and now at 29 I've only had one relationship come out of them that actually went past a few dates; that unfortunately ended last year ... and she was in the medical field and almost completely uninterested in computers (we mostly bonded over the outdoors).

    My advice (speaking openly as someone who ... doesn't love where he ended up): keep an open mind, try to find hobbies you genuinely like that are more likely to involve women, and just ... focus on meeting people.

    Unfortunately for me, I've found most of my hobbies outside of computers to be pretty unhelpful for meeting women (e.g., one of them is hiking; plenty of women do it at least occasionally, but starting a conversation with a girl who's all alone in the middle of the woods, or in a group with her friends ... well, I've yet to do it, despite being a fairly social person elsewhere these days).

    If you're in college, definitely take advantage of the first few years, when you're doing gen-ed classes, to meet people outside of any computer-science-related major ... and maybe consider taking some electives that are just more likely to have women in them. If someone you meet isn't interested, take it at face value; maybe keep them around as a friend, but move on. Leave the "win over the girl who wasn't interested" stuff for the movies (I've never seen it work).

  • It also tends to skew older from what I've seen. Might be great for OP, not so great for someone in their twenties or early thirties.

  • vkBasalt isn't Gamescope though?

  • 9/11 happened on 9/11 because 9-1-1 is a thing... At least that's what I've always heard.

  • > By integrating everything into it, it has become a good enough medium of communication for almost everything.

    Except that's not at all what we've done.

    The only reason English dominates is that it was the dominant language of the world superpowers following World War II. It's not because of some special design, principle, or properties.

    English isn't just "make up whatever rules and put them wherever", particularly formal English which is what we're talking about in the context of education.

    Really, a better argument against changing the spelling is the classic "standards" xkcd: now you're just making another dialect of English that spells words differently again, and it needs to be adopted, fracturing the language further.

    Language will evolve with or without direction. We have the structure in the form of schools to actually evolve it with direction in the name of making things more consistent and intuitive. We should use it, that's all.

  • The old "why try to do anything because it will never be perfect" argument never holds water.

  • I disagree that it's a fool's errand. Misspellings rarely become popular enough to become "proper" because we teach everyone the "proper" spelling, and we have spell checkers on the computers we use for virtually everything.

    There's no mechanism for speakers of English to put pressure on a word that already exists, because we've built up this infrastructure to "lock things in" and insist that "they've been this way, so they must continue to be this way." The only way we get language evolution currently is via slang ... which is hardly a way to get a better language.

    I know the history of facade; it's like many other words we've stolen from other languages that don't make a lick of sense in our alphabet. It's not an infinite list; it's fixable, but we need to change the mindset that "it has to be this way."

    We made up official spellings, we can fix them, they're not an immutable law of nature.

  • Yeah, don't do that. You're asking for the game to shut off, at best. They probably won't have the anti-cheat configured to outright ban you for that, but as a general rule: do not tamper with multiplayer anti-cheat-protected games.

    My recommendation would be to go to their Discord and request the option.

    EDIT: I will say this is overly cautious advice; I've never seen Crytek, or any other developer, ban people for using render-altering software that wasn't specifically designed to be a cheat.

  • > Might you be thinking of “Sweet Little Sixteen” by Chuck Berry? The guy who btw installed cameras in women’s bathrooms?

    Ah yes, that's the one (oops).

    > Love the Beatles, mind you, but uhhhh… all of those boomer bands were like that.

    Yeah, I like several Chuck Berry songs but ... definitely a different take. Those songs would not fly today on the radio.

  • That's something only a teacher would say. As someone who did all their schoolwork and got a fancy engineering job: a lot of it was bogus busywork that 99% of us have completely forgotten.

    You can't tell me I needed two teachers having me comb through the book for words that weren't part of the index, so I could copy the textbook definition onto a piece of paper verbatim on a weekly basis, and that that was a good educational experience.

    You can't tell me my high school study hall where they'd give you something to do if you were bored and forbid you from sleeping or playing games unless the study hall monitor "liked you" was a good experience.

    I mean my high school algebra teacher couldn't even remember the algebra lesson she'd taught every year for over a decade when I had her. If it was really a life skill or that important, she would've remembered.

    In calculus they teach you the hard way to differentiate (from the limit definition), and then they're just like, "ah, but actually you can do it this way, and that's how everyone does it."

    Artificially raising the difficulty by forbidding formula sheets in math is also just stupid. If you can see the problem, recognize which formula to use, and use it, that should be enough.

    We're straight up wasting millions of hours of people's time with an education system that has very little merit in terms of long-term results and retention, and it negatively affects both the people who come out of it "passing with flying colors" and the people who flunk out because of home life circumstances, bad teachers, difficulty with the material, or a lack of interest.

    Students are miserable (suicide is at an all-time high last I checked, and I'm pretty confident it's not just about social media), administrators are miserable, teachers are miserable, and kids really don't learn all that much that stays with them into adulthood. We desperately try to shove way too much information into people's heads in a very dry and uncaptivating way. We need to throw the system out and figure out how to teach what matters, and change or replace the stuff that doesn't matter or make sense (e.g., we changed the spelling of various words in the past; why don't we fix them instead of teaching a bunch of ridiculous spellings that make no sense, like facade, ghost, llama, etc.).

  • Yeah, "Sweet Seventeen" by Chuck Berry also hits ... as, at the very least, creepy in 2024.

  • That's fair, I did just check my Rocky Linux install and it does indeed use LVM.

    So much stuff in this space has moved to hosted/cloud that I didn't think about that.

  • Votes already are public to all server admins (I can see exactly what you voted for in communities my instance knows about).

  • Did your friend borrow your Lemmy account to shame you for not learning to Google things?

  • I haven't seen LVM in any recent Fedora (very high confidence), Debian (high confidence), or openSUSE (fairly confident) installations (using just the default options) on any system with GPT partition tables.

    For RAID, I've only ever seen mdadm or ZFS (though I see LVM is an option for this as well, per the Arch Wiki). Snapshotting I normally see done either at the file level with something like rsnapshot, Kopia, or restic, or with a filesystem that supports snapshots, like btrfs or ZFS.

    If you're still using MBR and/or completely disabling EFI via the "legacy boot loader" option or similar, then yeah, those installers will use LVM ... but I wouldn't say that's the norm.
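    If you want to check what a given install actually did, a minimal sketch (the `uses_lvm` helper is hypothetical; it just scans `lsblk` TYPE output for `lvm` device entries):

    ```shell
    # Hypothetical helper: reads `lsblk -no TYPE` style output on stdin and
    # reports whether any LVM logical volumes are present.
    uses_lvm() {
        if grep -qx 'lvm'; then
            echo "LVM volumes present"
        else
            echo "no LVM volumes"
        fi
    }

    # On a real system you'd pipe live output in:
    #   lsblk -no TYPE | uses_lvm
    # Canned example of a plain GPT + ext4 layout (no LVM):
    printf 'disk\npart\npart\n' | uses_lvm
    ```

    `sudo lvs` or `sudo pvs` will also list logical/physical volumes directly if the LVM tools are installed.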

  • Go for it; use a personal anecdote if you have one about how you or someone you know got bitten by not putting blocks in front of the wheels.

    If you don't have one ... make up a white lie about how your now-deceased great-uncle Johnny told you he left a trailer out without blocks on a little slope like that, and it rolled into the street.

    An anecdote or a story can be an ... easier path to selling the truth, one that feels a bit more humble than "I know better than you, so do what I say."