  • Legitimate military targets, as it might feed a potential terrorist. And even if it didn't, not killing it might allow for a surplus of food in a way that benefits terrorists.

    /s, but probably not far from their reasoning, for what is actually state-sponsored terrorism.

  • Only losing a week on a major change is a good sign. I wish the people who started the project had had that same attitude towards clarifying requirements. They also did the opposite of designing a flexible solution: no thought given to the actual problem, just a contrived problem picked to "tackle". Full-on blinders for event-driven architecture: a simple thing split into multiple nano-services, yet tightly coupled by sharing the same model, which is de/serialized at every step, with application-level filtering on the events thrown in on top... no schemas, no semantic versioning.
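
    To make that last complaint concrete, here's a minimal sketch of what a versioned event schema could look like, so consumers validate an explicit contract instead of blindly sharing one de/serialized model class. All names and fields here are hypothetical, not the actual project's model:

    ```python
    # Minimal sketch of a versioned event envelope (all names hypothetical).
    # Consumers check the schema name and major version instead of blindly
    # deserializing a shared model class.
    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class OrderCreatedV1:
        schema: str       # e.g. "order.created"
        version: str      # semver, e.g. "1.0.0"
        order_id: str
        amount_cents: int

    def publish(event: OrderCreatedV1) -> str:
        return json.dumps(asdict(event))

    def consume(raw: str) -> OrderCreatedV1:
        data = json.loads(raw)
        major = int(data["version"].split(".")[0])
        if data["schema"] != "order.created" or major != 1:
            raise ValueError(f"unsupported event: {data['schema']} v{data['version']}")
        return OrderCreatedV1(**data)

    print(consume(publish(OrderCreatedV1("order.created", "1.0.0", "o-42", 1999))))
    ```

    With an envelope like this, a breaking change to the payload means bumping the major version, and old consumers fail loudly instead of silently misreading a shared model.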

  • "waiting for solid requirements"

    This is exactly the situation. Except that my team, consisting of consultants, just "started" instead of trying to scope out the constraints and the larger picture. I joined a month or so after.

    Six months in, and the result so far of their exploration is a fairly uninteresting happy-path use of some technologies, barely related to the task with the unclear requirements. Turns out the work done is unsuited for it. It boggles the mind how many resources are wasted on such things.

    It feels extremely unrewarding to have worked, relatively hard, for half a year, when the fruit of my labour is... getting to the point where the actual problems can be solved. Which one could have done from day one, if one had started in a team without the wrong preconceptions, or with no team at all, for that matter.

  • Maybe I'm not aware of what I'm missing out on. On GrapheneOS on a Pixel phone, you can install the camera app made by Google, com.google.android.GoogleCamera. It has some fancy features, but maybe not the ones you are referring to?

  • It was the job switch that landed me in that situation. A change from a small company, where about 70% was actual productivity, to a large corporation, in a team with severe issues around planning and working on the correct problems. So far it's been 6 months of... well, everything from wondering if I'm missing something, or a bigger picture somewhere, to trying to turn the ship in the right direction. If it's still like this in another 6 months, I'll consider a change of scenery.

  • Still doesn't mean "massed" is correct, and it's weird to use here. Never understood why one would go out of one's way to present an incorrect argument.

    Edit: feel free to downvote me. It doesn't make me any less right, or the parent comment any less incorrect. Weird, but hey, there are worse things to be wrong about.

  • It's fascinating how some SPAs come about. Often it's consultancies that win some bid to implement X features. Since "good user experience" is hard to quantify/specify, it ends up being a horrible end result.

    Zalaris is one such SPA that I'm in complete awe of. They set up user flows that are expected to take 30 minutes to complete, yet don't keep track of that state/progress within their own SPA. Click the wrong tab within the SPA, and the state is reset.

    It's just fascinating.

  • Thanks for the clarifications.

    I do hope it improves. I never understood why Wayland became a thing if it's fundamentally flawed. But then, on the other hand, it's strange not to make the improvements in X11, unless that too is fundamentally flawed.

  • I don't know about all of those. Not sure if you downvoted me; if so, you might have the predisposition of not giving a shit, in which case I'd be most happy to oblige.

    As for the technical implementations / shortcomings, I... don't really care about them. The reason I didn't use Wayland before was that things didn't work. The reason I don't use X11 now is that things occasionally stop working. The reason I still sometimes use X11 is that, unless I do, some specific software doesn't work. That's the frame of mind I have, and I don't have any allegiance or vested interest beyond that. You seem to have that, and that's great. Caring about the technical details has my respect.

    So as for the stuff you mention that is directly user-facing:

    • Screen recording used to be a problem; I haven't had that issue recently. OBS records my screens, and parts of them, just fine.
    • Window forwarding like you could do with X11's ssh -X is amazing, and doesn't work, but it's been about 15 years since I used it.
    • Crashes that completely freeze my computer don't happen in Wayland. They happen in X11 (it's not a kernel panic, but whatever it is, I have to reboot, so the end result is the same to me).
    • I have had no issues with any of the monitors I own.
    • Global hotkeys work, and have always worked, for me. If they didn't, I simply wouldn't use Wayland, as a lot of my workflows revolve around tools I have built and trigger with global hotkeys.
    • Sleep mode I don't use. Is that the same as hibernation?
    • I don't use a single AppImage, but I downloaded one to try just now, and it worked fine.
    • What is redshift?
    • "Windows can't raise themselves or keep themselves raised": does this mean requesting to be in focus? I'm curious which programs benefit from this.
    • sudo is insecure by default in Wayland? How come? I'd be interested to know how it has anything to do with Wayland/X11, unless you mean GUI applications executed with sudo not having access to Wayland stuff?

  • I was in the same camp one year ago. I sometimes still use X11 due to Synergy not working otherwise.

    It's a common occurrence in X11 that I get a full-screen "Oops, something broke. [Log out]" screen, except you cannot log out, because the screen doesn't register any inputs.

    So, these days: Wayland just works, and X11 (except for some specific software) causes problems. But I also use an AMD GPU.

    So, what in particular is not ready with Wayland? I hated it two years ago. Now, I have little reason to.

  • Kojima is the game industry's equivalent of JJ Abrams. Great visual execution, but absolutely horrendous story-telling that will make you wish you were as dumb as a loaf of bread so as not to notice it.

    When Kojima commented that he didn't fully understand the story himself... it sort of all made sense. It's just connotations mashed together: beach, strand, hair, cord... A big pile of nothing to create intrigue, with no payoff, no mystery to reveal, just more layers of confusion. Sort of like Lost. I'm sure JJ and Kojima would get along great.

    But oh boy, are some of those moments exceptionally beautiful and spectacular in all their illogical absurdity. Mads Mikkelsen's acting: got goosebumps. But then it falls apart by revealing flaws through the fourth wall. Like, did... part of this mystery hinge on the double meaning of words? Whatever the fuck was going on, it's a little bit silly for synonyms to play an important part.

    Kojima has a lot of other great tastes. Using music to create moments of excellent cinematography. Motion capture and character designs have always been fantastic. There are moments in Death Stranding that made me have to put down the controller and just... enjoy. The same goes for the Metal Gear games I grew up with. The flower field in MGS3, and forcing you to pull the trigger... The attention to detail in so many gameplay mechanics. It's just brilliant. But the illogical and meaningless complexity of the story and world-building? That has always been the weakest part and left a bad taste. In MGS it was confusing enough, but it had a certain charm. In DS, puuh, it's rough.

    JJ and Kojima should have nothing to do with writing storylines and plots. Imagine how much brilliant stuff we would be left with. And I never understood why. In JJ's case, I suspect it's simply a decent return on investment for those who fund the movies. But from a craftsmanship perspective, it's weird. The culmination of work from hundreds of artists, all masters in their respective fields, and it shows; yet it comes together to tell a story, surrounding a plot, that a 14-year-old might put together.

  • XML isn't as common as one would think; it's been steadily decreasing in popularity and use. It's a very verbose format, suited to enriching a larger set of data, such as HTML documents. For data-heavy documents, it's a particularly bad match, as you end up using as much text for annotation as for the data itself.

    Using XML for 3MF is IMHO a technical cop-out, where you don't really want to solve it "correctly", so you go with something that is "good enough". With XML, you know it'll be able to encode anything, be human-readable, and have existing parsing libraries in pretty much any programming language and standard library. So, it makes sense. However, if you're creating such a format, the least one should do is write a sibling standard for how to encode the data directly in binary. This isn't a hard thing to do; it just needs a standard for how to do it, so everyone agrees. There are examples online of how a rudimentary implementation could be done for OBJ files, and the principle is the same here. That way you could choose to export either as 3MF or 3MFB (for binary), and as long as your slicer, and whatnot, can decode it, you're good.
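
    For illustration, a minimal sketch of what such a rudimentary binary encoding could look like. The layout here (a count header followed by raw little-endian floats and indices) is my own invention for the sake of the example, not any actual 3MF sibling standard:

    ```python
    # Hypothetical binary mesh encoding (not a real 3MF sibling standard):
    # header = vertex count + triangle count, then the raw little-endian payload.
    import struct

    def encode_mesh(vertices, triangles):
        out = struct.pack("<II", len(vertices), len(triangles))
        for x, y, z in vertices:
            out += struct.pack("<fff", x, y, z)    # 12 bytes per vertex
        for a, b, c in triangles:
            out += struct.pack("<III", a, b, c)    # 12 bytes per triangle
        return out

    def decode_mesh(buf):
        nv, nt = struct.unpack_from("<II", buf, 0)
        off = 8
        vertices = [struct.unpack_from("<fff", buf, off + 12 * i) for i in range(nv)]
        off += 12 * nv
        triangles = [struct.unpack_from("<III", buf, off + 12 * i) for i in range(nt)]
        return vertices, triangles

    blob = encode_mesh([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
    print(len(blob), decode_mesh(blob))  # 56 bytes for one triangle, no parsing needed
    ```

    The point isn't this particular layout; it's that once everyone agrees on one, decoding becomes a fixed-size memory read instead of text parsing.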

    The hard part of 3MF was all the great work in standardizing what is represented, and how.

  • Completely agree about STL, however, I cannot for the life of me understand why 3MF isn't a binary format.

    It has all these big tech companies behind it, and they landed on the incredibly short-sighted mistake of making the format human-readable, instead of providing good tools for reading and modifying a binary format.

    Compressing the human-readable content is fine for reducing storage size. But de/serializing the XML is going to be a couple of orders of magnitude slower. Given a sufficiently large file, the difference would be waiting 30 seconds vs a barely noticeable 0.3 seconds.
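
    A rough way to sanity-check that claim, as a sketch rather than a rigorous benchmark (the element count and resulting timings are assumptions and will vary by machine and parser):

    ```python
    # Rough comparison: parsing vertices from XML vs unpacking the same
    # data from a flat binary buffer (stdlib only; timings are indicative).
    import struct, time
    import xml.etree.ElementTree as ET

    n = 200_000
    xml_doc = "<mesh>" + "".join(
        f'<vertex x="{i}" y="{i}" z="{i}"/>' for i in range(n)) + "</mesh>"
    binary = b"".join(struct.pack("<fff", i, i, i) for i in range(n))

    t0 = time.perf_counter()
    verts_xml = [(float(v.get("x")), float(v.get("y")), float(v.get("z")))
                 for v in ET.fromstring(xml_doc)]
    t1 = time.perf_counter()
    verts_bin = list(struct.iter_unpack("<fff", binary))
    t2 = time.perf_counter()

    print(f"XML parse:     {t1 - t0:.3f}s")
    print(f"binary unpack: {t2 - t1:.3f}s")
    ```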

  • Maybe I'm just a little too familiar with them, but I don't find LLMs particularly convincing as anything I would call "real AI". But I suppose that entirely depends on what you mean by "real". Their flaws are painfully obvious. I even use ChatGPT 4 in hopes of it being better.