
  • They're probably not pivoting, but in FY2023 Azure made up 38% of their revenue, followed by Office 365 at 23%. That's a lot of cloud service revenue.

    Is it sustainable? Honestly, it might be. They sell a lot of stuff under the Azure umbrella and corporations lap that shit up. (Seriously: my employer is about ready to hire consultants to come up with additional eggs they can put in that particular basket.)

    Here's my source; I couldn't be arsed to look it up in MSFT's statements directly.

  • Leopard and Snow Leopard had vastly better virtual desktops than Lion onward. You actually had a grid of them and could navigate up/down/left/right with shortcuts; afterwards you only got a linear list of desktops.

    Gridded desktops were great. I had a 3x3 grid, of which five cells were used. My main desktop was "centered". Thunderbird was to the right, my IRC and IM clients to the left, and iTunes below. I don't remember what was above; it's been a while.

  • My most used features so far are vertical splitters, vertical nudging, and the new placement modes for conveyors and pipes. With an honorable mention going to conveyor wall holes, which also free up a lot of design options.

    Honestly, though, just about everything in this update has been a godsend. Priority splitters are the only thing I haven't really used yet. Even the elevators rock; being able to zoop up to 200 meters up or down in one go can make them useful even as a temporary yardstick for tall structures. (Also, I did end up needing to go 150 meters straight down to get at some resources and can confirm that elevators handle their intended purpose very well.)

  • Do you want a prediction? The current cost of graphics cards will crash the classic PC gaming market. There are some enthusiasts who are buying cards for thousands of dollars or building 4.000€ computers, but the majority of gamers will stay on their laptops or might go for cheaper devices like the Steam Deck. And if your game needs more power, a modern graphics card, and a beefier PC, there are fewer and fewer people who can run it and many people can't afford it. So devs will target lower system specs to reach the bigger audience.

    Also, there's not as much value in high-powered GPUs right now because these days high-end graphics often mean Unreal Engine 5. UE5 is excellent for static and slow-moving graphics but tends to produce visible artifacts when the picture, and especially the camera position, changes quickly (not least because it's heavily reliant on TAA). These artifacts are largely independent of how good your GPU is.

    Unlike in previous generations, going for high-end graphics doesn't necessarily mean you get a great visual experience – your games might look like smeary messes no matter what kind of GPU you use because that's how modern engines work. Smeary messes with beautiful lighting, sure, but smeary messes nonetheless.

    My last GPU upgrade was from a Vega 56 to a 4080 (and then an XTX when the 4080 turned out to be a diva) and while the newer cards are nice I wouldn't exactly call them 1000 bucks nice given that most modern games look pretty bad in motion and most older ones did 4K@60 on the Vega already. Given that I jumped three generations forward from a mid-tier product to a fairly high-end one, the actual benefit in terms of gaming was very modest.

    The fact that Nvidia are now selling fancy upscaling and frame interpolation as killer features also doesn't inspire confidence. Picture quality in motion is already compromised; I don't want to pay big money to compromise it even further.

    If someone asked me what GPU to get, I'd tell them to get whatever they can find for a couple hundred bucks because, quite frankly, the performance difference isn't worth the price difference. RT is cool for a couple of days, but I wouldn't spend much on it either, not as long as the combination of TAA and upscaling hides half of the details behind dithered motion trails and time-delayed shadows.

  • Plugins and extensions could make sense if the site and plugin are designed to talk to each other. But that could be made safer by each extension being able to decide whether to announce itself (and the user being able to override that).

  • I got tired of it in 2013. While it does work in some places (Android does it reasonably well), I haven't yet seen a good flat design on the desktop.

    Windows 8 and 10 looked garish and hard to read, especially since everything was a rectangle with a one-pixel outline. Is it a button? Is it a text field? Maybe a thick progress bar? Who knows; they all looked extremely similar.

    While Apple did overdo it in the later big-cat OS X releases, I'll take a felt-textured widget panel and a calendar bound in leather over an endless sea of hairline rectangles.

  • Very true. Good coworkers can make work a lot more bearable.

    Looking a bit into the company's business can help, too. If they do something vaguely interesting that can be a bonus. I ignored that once in favor of perks and that got me into the complete disaster area that is fintech. Don't make the same mistake.

  • Das Millionenspiel.

    It's The Running Man except twelve years earlier and a media satire instead of an action movie. It comments on TV phenomena that wouldn't exist in Germany until two decades later (like scripted "reality" TV). Also, it has early appearances of one of Germany's most famous TV hosts (as the show's host, fittingly) and one of Germany's most famous comedians of the 70s to 90s (in a completely serious role, unfittingly). And unlike the Schwarzenegger movie it doesn't construct a dystopian future to introduce public bloodsports but merely gives a terse reference to a "law on active recreation" dated three years after the movie first aired.

    To make it even more odd, it's actually a good movie despite being from Germany and made for TV.

  • AI isn't taking off now; it already took off in the 60s. Heck, they were even working on neural nets back then. Same as in the 90s, when they actually got them to be useful in a production environment.

    We got a deep learning craze in the 2010s and then bolted that onto neural nets to get the current wave of "transformers/diffusion models will solve all problems". They're really just today's LISP machines: expected to take over everything but unlikely to actually succeed.

    Notably, deep learning assumes that better results come from a bigger dataset, but we've already trained our existing models on the sum total of humanity's writings. In fact, current training is hampered by the fact that a substantial amount of all new content is already AI-generated.

    Despite how much the current approach is hyped by the tech companies, I can't see it delivering further substantial improvements by just throwing more data (which doesn't exist) or processing power at the problem.

    We need a systemically different approach and while it seems like there's all the money in the world to fund the necessary research, the same seemed true in the 50s, the 60s, the 80s, the 90s, the 10s... In the end, a new AI winter will come as people realize that the current approach won't live up to their unrealistic expectations. Ten to fifteen years later some new approach will come out of underfunded basic research.

    And it's all just a little bit of history repeating.

  • I remember the early 2000s when basically 90% of all Americans were absolutely certain that jihadists were going to attack their local supermarket any minute now because Power Cable, Nebraska was such a strategic target.

    Heck, there was a bomb scare because of an advertising campaign for Aqua Teen Hunger Force that involved placing PCBs with LEDs on them that would display characters from the show. Because surely al-Qaeda would put conspicuous LED displays on their bombs.

    News media want people to panic so they keep tuning in. Panicked people tend to come up with remarkably stupid scenarios like "al-Qaeda have unlimited resources and can show up anywhere to shoot people at random" or "Hamas want to take Dorchester as a strategic location to strike at Israel from".

  • Oh, don't get me wrong, I fully agree. Undefined behavior is terrible UX and a huge security risk.

    Undefined behavior was kind of okay when RAM and storage were measured in kilobytes and adding checks for this stuff was noticeably expensive (sketched at the end of this comment). That time has passed, though, and modern developers have no business thinking like that, even those working in low-level languages.

    I should've phrased my comment differently.
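
    To make the cost argument concrete, here's a minimal C sketch (my own illustration; the buffer and function names are hypothetical, not from the thread). The unchecked version compiles down to a bare load, while the checked version pays a compare and a branch on every call, which genuinely mattered on kilobyte-era machines and is noise today.

    ```c
    #include <stdbool.h>
    #include <stddef.h>

    /* Hypothetical fixed-size buffer. */
    static int buf[16];

    /* Classic C style: no check. Calling this with i >= 16 is
     * undefined behavior, and whatever the program then does is
     * "correct" as far as the language spec is concerned. */
    int get_unchecked(size_t i) {
        return buf[i];
    }

    /* Checked style: one extra compare and branch per call.
     * Measurable overhead on a machine with a few KB of RAM,
     * negligible on modern hardware. */
    bool get_checked(size_t i, int *out) {
        if (i >= sizeof buf / sizeof buf[0])
            return false;
        *out = buf[i];
        return true;
    }
    ```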

  • Yeah, that's basically the kind of logic you use when designing a low-level programming language: if we didn't define what happens here, then anything that happens is correct behavior, and it's up to the user to avoid it.

    Of course, applying that logic to a GUI application intended for a comparatively nontechnical audience is utter madness.
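
    To make that concrete, here's a minimal C sketch (my own illustration, not from the thread) of how that logic plays out once an optimizer takes the spec at its word:

    ```c
    #include <limits.h>
    #include <stdio.h>

    /* Signed overflow is undefined behavior in C, so an optimizing
     * compiler may assume x + 1 never overflows, conclude that
     * "x + 1 < x" is always false, and delete the branch below
     * (gcc and clang typically do this at -O2). Whatever then happens
     * on overflow is, by the standard's definition, correct behavior. */
    int increment(int x) {
        if (x + 1 < x) {              /* intended overflow check */
            fputs("overflow\n", stderr);
            return INT_MAX;
        }
        return x + 1;
    }

    /* The defined way to write the same check: test the limit
     * before doing the addition. */
    int increment_checked(int x) {
        if (x == INT_MAX) {
            fputs("overflow\n", stderr);
            return INT_MAX;
        }
        return x + 1;
    }
    ```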