
  • Always keep an open mind. I stuck around in my first job until the sad and pathetic end for everyone, and when I finally did start looking, the economy was worse than it had been when the writing was first on the wall.

  • I’ve had too many arguments with management about letting them merge and I’m not letting that ruin my code base

    I guess I'm lucky; before here, I always had 100% control of the code I was responsible for. Here (last 12 years) we have a big team, but nobody merges to master/main without a review, and screwups in the section of the repository I am primarily responsible for have been rare.

    We have a new VP collecting metrics on everyone, including lines of code, number of merge requests, times per day using AI, and days per week in the office vs. at home.

    I have been getting actively recruited - six figures+ - for multiple openings right here in town (not a huge market here, either...). This may be the time...

  • It's not the purpose of LLMs to lower human skills' value, it's just the inevitable outcome.

    Transcriptionist? Industry died with good voice recognition 10-20 years ago.

    Ditch-digging shovel crew? Dramatically devalued with the advent of the steam shovel...

    And on and on... The theory goes that it gives people more free time, but the way wealth is distributed, it is dividing people into those with jobs serving the wealthy and those who live on handouts.

    I think: non-stigmatized "handouts" for everybody are the way of a brighter future. UBI FTW.

  • I've always had problems with junior engineers (self included) going down bad paths, since before there was Google search - let alone AI.

    So far, AI overall creates more mess, faster.

    Maybe it is moving faster, and maybe they do bother the senior engineers less often than they used to, but for throw-away proof-of-concept and similar stuff, the juniors+AI are getting better than the juniors without senior support used to be... Is that a good direction? No. When the seniors are over-tasked with "Priority 1" deadlines (nothing new), does this mean the juniors can get a little further on their own, and some of them learn from their own mistakes? I think so.

    Where I started, it was actually the case that the PhD senior engineers needed help from me, fresh out of school - maybe that was a rare circumstance, but the shop was trying to use cutting-edge stuff that I knew more about than the seniors did. Basically, everything in 1991 was cutting edge, and it made the difference between getting something that worked and having nothing if you didn't use it. My mentor was an expert in another field, so we were complementary that way.

    My company (now) wants metrics on a lot of things, but they also understand how meaningless those metrics can be.

    I have to spend more time helping the junior guys out of the holes dug by AI, making it a net negative.

    https://clip.cafe/monsters-inc-2001/all-right-mr-bile-it/

    Shame. There was a time when people dug out of their own messes; I think you learn more, faster, that way. Still, I agree - since 2005 I have spent a lot of time taking piles of Matlab, Fortran, and Python that were developed over years to reach critical mass - add anything else to them and they'll go BOOM - and translating those into commercially salable / maintainable / extensible Qt/C++ apps. I don't think I ever had one "mentee" through that process who was learning how to follow in my footsteps; the organizations were always just interested in having one thing they could sell, not a team that could build more like it in the future.

    it’s just another tool.

    Yep.

    If you had to answer how much time autocomplete saved you, could you provide any sort of meaningful answer?

    Speaking of meaningless metrics, how many people ask you for Lines Of Code counts, even today?
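
    A LOC counter is itself only a few lines of code, which says something about what the number is worth. A minimal Python sketch (hypothetical, not any particular vendor's tool):

    ```python
    # Naive lines-of-code "productivity" metric: trivial to compute, trivial to game.
    from pathlib import Path

    def count_loc(root: str, suffixes=(".py", ".cpp", ".h")) -> int:
        """Count non-blank lines across source files under root."""
        total = 0
        for path in Path(root).rglob("*"):
            if path.is_file() and path.suffix in suffixes:
                text = path.read_text(errors="ignore")
                total += sum(1 for line in text.splitlines() if line.strip())
        return total

    # The same behavior can be written in 5 lines or 50: split statements,
    # pad with comments, vendor a library - the "score" goes up, the value doesn't.
    print(count_loc("."))
    ```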

  • Gnome is a good example of something that creates too much of a dependency

    Agreed, I was never happy with GNOME, and starting about 5 years back I have been migrating my systems, personal and professional, off of it. That’s the nature of FOSS, no contracts to negotiate, make the choices that make sense for your use cases and execute them.

    Does Gnome have too much dependency on Gnome: yes or no?

    Absolutely. If you don't mind using Gnome exactly as Gnome wants you to - this year - then it's usually a pretty refined desktop experience, but if I wanted to be told what to like, how to like it, and to shut up and be happy, I'd use a Mac.

    I prefer XFCE for its modularity... don't want a launcher bar? Don't run the launcher; nothing else misses it when it's gone.

    Mess around with Gnome too much and it becomes a nightmare mess of dependencies.

  • AI tools are actually improving at a rate faster than most junior engineers I have worked with, and about 30% of junior engineers I have worked with never really "graduated" to a level that I would trust them to do anything independently, even after 5 years in the job. Those engineers "find their niche" doing something other than engineering with their engineering job titles, and that's great, but don't ever trust them to build you a bridge or whatever it is they seem to have been hired to do.

    Now, as for AI, it's currently as good or "better" than about 40% of brand-new fresh from the BS program software engineers I have worked with. A year ago that number probably would have been 20%. So far it's improving relatively quickly. The question is: will it plateau, or will it improve exponentially?

    Many things in tech seem to have an exponential improvement phase, followed by a plateau. CPU clock speed is a good example of that. Storage density/cost is one that doesn't seem to have hit a plateau yet. Software quality/power is much harder to gauge, but it definitely is still growing more powerful / capable even as it struggles with bloat and vulnerabilities.
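
    If you want a toy model for that pattern, a logistic curve is the usual shape: it looks exponential early on and flattens at a ceiling. A sketch with made-up parameters (an illustration, not a forecast):

    ```python
    # Toy "exponential phase, then plateau" model: a logistic curve.
    # All parameters here are made up for illustration.
    import math

    def capability(t: float, ceiling: float = 1.0, rate: float = 1.0, midpoint: float = 5.0) -> float:
        """Roughly exponential growth early on, saturating near the ceiling."""
        return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

    for year in range(11):
        print(year, round(capability(year), 3))  # ~2.7x per year early, then flat
    ```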

    The question I have is: will AI continue to write "human compatible" software, or is it going to start writing code that only AI understands, but people rely on anyway? After all, the code that humans write is incomprehensible to 90%+ of the humans that use it.

  • I have limited AI experience, but so far that's what it means to me as well: helpful in very limited circumstances.

    Mostly, I find it useful for "speaking new languages" - if I try to use AI to "help" with the stuff I have been doing daily for the past 20 years? Yeah, it's just slowing me down.

  • It definitely depends on what you are trying to get out of it.

    I'll grant: low-lag audio performance in Windows is... dismal. Which is why everyone had conference-call lag-adjustment issues in 2020: "go ahead", "no, you go ahead", "ok" - both start talking simultaneously again... It seems better these days; I'm sure that's at least in part due to training of the conference participants, but maybe they have also been working on getting the lag down without too many dropouts / stutters.

    We have a bespoke low-lag audio system that was specifically implemented in Linux - even though we put the GUI in Windows - because of those lag / stutter issues. Years back, the audio was done on a dedicated DSP chip, but a Core i7 is more than up to the task on Linux these days.

    The Linux audio pains I refer to were: A) audio just doesn't work at all, and B) audio works until you try to use two audio applications simultaneously - then they start to mess each other up. Both of those were better in Windows long before Linux came up to speed. But a lot of how Windows audio gets acceptable performance is big, laggy buffers.
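
    The buffer-size / latency trade-off is easy to put numbers on. A sketch using the third-party sounddevice (PortAudio) package - the package choice is an assumption on my part, your stack may differ - where smaller blocks mean lower latency but more chances to underrun:

    ```python
    # Buffer size vs. latency: smaller blocks cut delay but risk dropouts (xruns).
    # Assumes the third-party `sounddevice` (PortAudio) package is installed.
    import sounddevice as sd

    SAMPLE_RATE = 48_000  # Hz

    for blocksize in (64, 256, 1024, 4096):
        latency_ms = blocksize / SAMPLE_RATE * 1000  # delay added per buffer stage
        print(f"blocksize {blocksize:5d} -> ~{latency_ms:6.2f} ms per buffer")

    def callback(indata, outdata, frames, time, status):
        if status:
            print("xrun:", status)  # dropouts/stutters surface here under load
        outdata[:] = indata  # simple loopback: input straight to output

    # Duplex stream at a smallish block size; push blocksize lower and watch for xruns.
    with sd.Stream(samplerate=SAMPLE_RATE, blocksize=256, channels=1, callback=callback):
        sd.sleep(2000)  # run for two seconds
    ```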

  • things before were way worse… why not throw Sticks and stones at those people?

    My earliest memories of Linux audio were in Slackware in the mid-90s, reading and re-reading the HOWTO that started off with a bunch of attitude about how real computer users don't need audio, but we can do it anyway ("so, if you must hear Biff bark..."), followed by a bunch of very unhelpful things to try that never worked on any system I tried them on. Diverse systems that, of course, all played audio through Windows flawlessly.

    Linux audio has been a cluster^$%< of epic proportions since the mid-1990s. At least you can make single-application systems work well these days, but Windows has really whipped the llama's ass on the audio front for 30+ years now in terms of "it just works" user experience - without being hyper-draconian on the application ecosystem.

  • I see most often that it's the people who live in init.d - interact with it multiple times a day - who are most vocal about systemd hate. I'm going to call "old dogs don't like new tricks" on that one.

    I do get into that layer of system maintenance, but it's maybe 1-2% of my time - mostly a set-it-and-forget-it kind of relationship. There was a time when the old ways were easier due to more documentation and guides on the internet, which I lean on heavily because I interact with this stuff so rarely. Those days passed, for me, 8-10 years back.

  • Could be the difference maker in a game someone wants to play on their system.

    One reality of the world is: the developers choose what hardware/OS configurations they target. If the makers of your game don't target your RAM-efficient system, you're outta luck. Developers make their choices for their own reasons, but even with the ever-growing FOSS communities, the majority of developers work for a paycheck; that paycheck comes from profitable businesses, and those businesses have a very strong influence on what the developers work toward. The businesses only exist because they are profitable... FOSS may not be bound by those realities, but it lives in a world dominated by them.

  • 90mb ram

    If you're in a system where 256MB of RAM is the limit, sure - go for the RAM-efficient OS options; they're out there.

    Can you even buy less than 2GB of RAM in a desktop system anymore? Even the Raspberry Pi 5 starts at 2GB (and yes, the older models have less, but I did say desktop system, implying reasonable desktop performance). Maybe if you feel the need to use a RasPi 3 as a desktop for something, then you should dig around for one of your more efficient OS configurations, but I'll note... back when the RasPi 3 was the new model, Raspbian came by default without systemd, but offered a systemd option. The systemd option booted to a desktop (such as it was) in about 1/3 the time.
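
    If you want to put numbers on boot time yourself on a systemd machine, systemd-analyze does the bookkeeping. A tiny Python wrapper (my own sketch; the subcommands themselves are standard systemd):

    ```python
    # Where does boot time go on a systemd system? `systemd-analyze` is part
    # of systemd itself; this just shells out to it and prints the highlights.
    import subprocess

    for args in (["systemd-analyze", "time"], ["systemd-analyze", "blame"]):
        result = subprocess.run(args, capture_output=True, text=True)
        print("$", " ".join(args))
        print(result.stdout[:400])  # `blame` lists the slowest units first
    ```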

  • So, I don't like the guy either, but for a little devil's advocacy:

    The stuff that already "just works" was developed during a very different era in terms of computing power, the tasking of the computers which were running the systems, etc. Nobody (serious, and he is serious) develops something different just because "why not?" They feel, at least from their perspective, that they are improving on the status quo, at least for the use cases they are considering.

    one-size-fits-all mentality is

    being decided by the distro maintainers, not the developers. Sure, developers promote their product, but if a distro thinks that multiple flavors are a better path, they distribute multiple flavors. It's not like the systemd developers are filling billion-dollar war chests with profit by using strong-arm tactics to coerce distro maintainers into adopting their products.

    stuff everything into one bin

    When one bin serves the purpose, it's a lot easier to maintain, modernize, security-harden, etc. than ten bins.

    the community and its users will not always be able to freely develop FOSS.

    Fork it and your loyal users will follow.

    Gnome is a good example of something that creates too much of a dependency

    Agreed, I was never happy with GNOME, and starting about 5 years back I have been migrating my systems, personal and professional, off of it. That's the nature of FOSS, no contracts to negotiate, make the choices that make sense for your use cases and execute them.

    FOSS shouldn’t work like that.

    FOSS, by its very nature, should be expected to work all the ways. If a particular way can't get enough developer traction, it stagnates but never really dies, not until the ecosystem it is dependent upon can no longer find hardware to run on and users willing to run it.

    IBM/Red Hat finally decide to seal the deal and lock everyone out for good.

    I am very glad that I walked away from CentOS about 8 years back; its proximity to Red Hat never made me happy. I have been trying to walk away from Canonical (toward Debian) for about 3 years now, but it still has some hooks that keep our professional team happier than Debian does. If the unhappy ever outweighs the happy, we'll execute the move.

    Sorry if I can’t rejoice

    Never asked you to. End of devil's advocacy. I still don't like the guy, but I never really interact with him. I do interact with his products and the alternatives, and in my use cases the products speak for themselves. There's nothing about systemd that makes me dig around for systemd free alternatives - they are out there, but for my use cases I don't care. YMMV.