Posts 1 · Comments 1,473 · Joined 12 mo. ago

  • In which case would a competent dev use an LLM?

  • Still headpettable tho

  • The old people's drugs to get through the day

  • Are you aware that this exact functionality is built into Windows by default, for free?

  • I have it on Android 13, albeit badly patched. Maybe older versions have a critical security bug in the implementation that takes longer than the disclosure window to fix, or can't be fixed at all?

  • $4-7? So I could basically save up a little in the West and live a stress-free life there? (Ignoring the CCP etc., of course.)

  • in the US is the important part here.

  • "Critical thinking"? What is that? Let me just ask Gemini...

  • ML was being used for e.g. medical purposes years before ChatGPT dropped. It's only called "AI" now because that sounds fancier. Consumer ML ("AI") is useless for anything professional: generating a 10-paragraph text filled with emojis, em dashes, and even more factual errors won't solve anything. Training ML models to e.g. detect virus DNA similarities, simulate vaccines, or create realistic simulations of weather, climate, volcanoes etc., which, again, has been done for more than a decade, is still actively happening. It's just that it's not as simple as feeding a few million Reddit and Facebook posts into the training process to get factually correct results in a useful format.

  • To concretize:

    • Upscaling with ML may make the image acceptable, as long as you don't look at anything the devs don't want you to look at. You're supposed to look at yourself, your objective/enemy, and maybe a partner. Everything else, especially foliage, hair etc., looks like shit: flickery, changing with distance and perspective, with weird lighting. Thing is, if we aren't supposed to pay attention anyway, we could just go back to HL-level graphics. Even HL1. However, that would break the whole point of stuff looking good: being able to enjoy looking around and not only feeling immersed, but like you're seeing something you'll never actually see in real life. Not because it looks unrealistically artificial, but because it's too beautiful, too crazy, and too dreamy to be real. But when I get literally sick from actually looking around, because things change in abstract ways my brain doesn't expect, that takes away the only real visual advantage over GoldSrc.
    • Frame Gen does not make sense for literally any group of people:
      • You have 20FPS? Enjoy even worse input lag (to generate frame B between frames A and C, the algorithm first needs to know frame C, which adds another 1/20 of a second of delay plus the time to actually generate B; see the rough math in the sketch after this comment) and therefore a nice 80FPS for your eyes, while your brain throws up because the input feels like ≤10FPS.
      • You have 40FPS and want 80-160FPS? Well, that might be enjoyable for anyone watching, because they only see smoother gameplay; meanwhile you, again, have a worse experience than at plain 40FPS. I can play story games at 40FPS, no problem, but the frame counter doubling while the input lag literally more than doubles? Fuck no. I see so many YouTubers being like: "Next we're gonna play the latest and greatest, Monkey Cock Dong Wu II. And of course it has 50% upscaling and MFG! Look at that smooth gameplay!" - THE ONLY THING THAT'S SMOOTH IS YOUR BRAIN, YOU MONKEY. I CAN LITERALLY SEE THE DESYNC BETWEEN YOUR MOUSE MOVEMENTS AND INGAME, WHICH WAS NOT THERE IN OTHER BENCHMARKS, GAMES, OR ON THE DESKTOP. I'M NOT FUCKING BLIND. And, of course, worse foliage than RDR2. Not because of the foliage itself, but the flickering in between.
      • You have 60FPS? And you know you can actually see a difference between 60 and 144Hz, 240Hz etc.? Well, that's wrong, you only notice the difference in input lag. So it's going to feel worse, not smoother, because, again, the input lag gets even worse with FG, and much worse with MFG.
      • You have 80FPS but are a high-FPS gamer who needs quick reaction times? Well yeah, it only gets worse with FG. Obviously.

    And about 4K gaming ... play old games with new GPUs. RDR2 runs quite well on my 7800XT. More than 60FPS, which is plenty if you aren't a speedrunner or similar. No FSR, of course.
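    A minimal back-of-envelope sketch of the latency argument above, assuming interpolation-based frame gen (a generated frame between A and C can only be shown once C has been rendered) and a made-up 3 ms generation cost; frame_gen_latency_ms and every number in it are my own illustration, not vendor figures.

```python
# Rough estimate of the input-lag penalty of interpolation-based frame gen.
# Assumption: to show a generated frame between real frames A and C, the
# algorithm must buffer until C exists, adding at least one real frame
# interval on top of the normal render latency.

def frame_gen_latency_ms(real_fps: float, gen_overhead_ms: float = 3.0) -> dict:
    """Estimate input-to-photon latency with and without frame generation.

    real_fps        -- frames the GPU actually renders per second
    gen_overhead_ms -- assumed cost of generating one fake frame (a guess)
    """
    frame_time_ms = 1000.0 / real_fps
    baseline_ms = frame_time_ms                       # input shows up ~1 frame later
    with_fg_ms = 2 * frame_time_ms + gen_overhead_ms  # wait for next real frame, then generate
    return {"baseline_ms": baseline_ms, "with_frame_gen_ms": with_fg_ms}

for fps in (20, 40, 60, 80):
    r = frame_gen_latency_ms(fps)
    print(f"{fps:>2} real FPS: ~{r['baseline_ms']:.0f} ms -> ~{r['with_frame_gen_ms']:.0f} ms with frame gen")
```

    At a 20FPS base the estimate more than doubles, from ~50 ms to ~103 ms, which is why 20FPS boosted to "80FPS" via MFG can still feel like ≤10FPS input (a 10FPS frame time is 100 ms).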

  • Fuck Nestlé btw

  • Do you see "blended learning" combining AI, code, and automation as the new normal?

    It could become useful for making monkeys, or you, able to write a hello world program.

  • I'd guess that desktop Linux users statistically use their PC more in summer than the average PC user (on Windows).

    What? I can definitely say that's true for literally all of my friends, and especially me.

  • Weebs fucking weebs?

  • Probably Mr. Robot, considering the others would be South Park and Rick and Morty. And it taught me ... not to take Adderall, I guess.