
  • Slow compared to just chucking everything into a single source file, actually: https://github.com/j-jorge/unity-build

    That's only true for clean builds, and even then it isn't universal; there are also other reasons not to do unity builds. But the existence of the technique, and the fact that historically it has sped up build times enough for various projects to adopt it, does show that the C++ model, with headers and separate compilation units, has some inherent inefficiency.

  • These are extremely superficial observations. You should learn more about each of these languages before dismissing them; Go is especially easy to learn.

    (I quite dislike Go, actually, but "it has no classes" is nowhere near a valid reason not to learn a language.)

  • Yes, popular programs behave correctly most of the time.

    But "perfectly fine for the last two decades" would imply a far lower rate of CVEs and general reliability than we actually have in modern software.

  • Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web....

    Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.

    Think about that, and then...what, exactly? As a website author, you don't control the browser. You don't control the web standards.

    I'm extremely sympathetic to this way of thinking, because I completely agree. The web is crap, and we shouldn't be complacent about that. But if you are actually in the position of building or maintaining a website (or any other piece of software), then you need to build on what already exists. The alternative is exceedingly rare: either you can near-unilaterally make changes to an existing platform (as Google does with Chrome, or as Microsoft and Apple do with their OSes), or you can throw out a huge amount of standard infrastructure and start as close to "scratch" as possible (e.g. GNU Hurd, Mill Computing, Oxide, Redox OS; note that several of these are hobby projects not yet ready for "serious" use).

  • I hear you, but here's my experience:

    I've had one coworker whose personal coding style actually somewhat resembled that in the Clean Code examples. He wrote functions as small as possible, used many layers of abstraction, and named everything very verbosely and explicitly.

    Now, to be fair, I don't think he did that because of Clean Code, and he also didn't follow most of the good practices that Martin recommends. Most egregiously, he almost never tested things, even manually (!!). He once worked an entire weekend to finish something that I needed for my part of the project, and when he was done, it didn't work, because he hadn't actually run it at any point (!!!!!).

    But even when his software did work, it was horrendous to navigate and modify, specifically because of that style of writing code. I know, because when he retired, I was the only person on the team who could deal with it, so his part of the project fell entirely on me.

    Now, I've also had to work with code that had the opposite problem: short names, no abstraction. I've also seen a sort of "worst of both" codebase, where the functions were exceedingly long and full of near-duplicate functionality, but where there was nevertheless a fair amount of modularity and abstraction overall.

    But in my opinion, it was much harder to deal with the code that hid all of its weirdness behind layers and layers of abstractions, despite those abstractions being carefully documented and explicitly named.

  • The whole book is like this, though, and these are specifically supposed to be examples of "good" code. The time class toward the end, an entire Java module that Martin rewrites, is a nightmare by the time he finishes with it. And I'm pretty sure it has a bug, though I couldn't be bothered to type the whole thing into an editor to test it myself.

  • Because abstractions leak. Heck, abstractions are practically lies most of the time.

    What's the most time-consuming thing in programming? Writing new features? No, that's easy. It's figuring out where a bug is in existing code.

    How do abstractions help with that? Can you tell, from the symptoms, which "level of abstraction" contains the bug? Or do you need to read through all six (or however many) "levels", across multiple modules and functions, to find the error? Far more commonly, it's the latter.

    And, arguably worse, program misbehavior is often due to unexpected interactions between components that appear to work in isolation. This means that there isn't a single "level of abstraction" at which the bug manifests, and also that no amount of unit testing would have prevented the bug.
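
    To make that concrete, here's a toy sketch (made up for illustration, in Rust; the names and the scenario are invented): each piece passes its own unit tests, and the bug only exists in their combination.

    ```rust
    // Component A: produces a timeout. Unit-tested in isolation, and
    // "correct" by its own tests -- which pin down the number, not the unit.
    fn timeout() -> u64 {
        500 // milliseconds
    }

    // Component B: waits for a duration. Also unit-tested in isolation,
    // and also "correct" -- its tests always happen to pass seconds.
    fn wait(seconds: u64) {
        std::thread::sleep(std::time::Duration::from_secs(seconds));
    }

    fn main() {
        // Both components behave exactly as their own tests specify, yet
        // together they sleep for 500 seconds instead of 500 milliseconds.
        // There is no single "level of abstraction" for this bug to live in,
        // and no unit test of either piece alone would have caught it.
        wait(timeout());
    }
    ```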

  • The problem is that most languages with exceptions treat that as the idiomatic error mechanism. So checked exceptions were invented, essentially, to do exactly what you say: add the exception type to the function signature.

    Having separate errors-as-return-values and unwinding-for-emergencies is a much more recent trend (and, IMO, an obviously good development).
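
    As a sketch of what that split looks like (minimal Rust, with invented function names; not tied to any particular codebase): the recoverable error is part of the return type, so it shows up in the signature much like a checked exception would, while unwinding is reserved for outright bugs.

    ```rust
    use std::num::ParseIntError;

    // Errors-as-return-values: the failure mode is in the signature, like a
    // checked exception, but it's an ordinary value the caller must consume.
    fn parse_port(s: &str) -> Result<u16, ParseIntError> {
        s.trim().parse::<u16>()
    }

    // Unwinding-for-emergencies: a violated invariant is a bug in the caller,
    // so we panic instead of making every caller thread an error through.
    fn checked_div(a: i32, b: i32) -> i32 {
        assert!(b != 0, "bug: divisor must be non-zero");
        a / b
    }

    fn main() {
        match parse_port("8080") {
            Ok(port) => println!("listening on {port}"),
            Err(e) => eprintln!("bad port: {e}"),
        }

        println!("{}", checked_div(10, 2)); // fine
        // checked_div(1, 0) would unwind -- an emergency, not a result.
    }
    ```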

  • Yes. True. But Uncle Bob literally complains about non-nullable types in the linked blog post.

    I'm not saying testing isn't important. I'm saying that hand-written unit tests are not the be-all and end-all of software quality, and that Uncle Bob explicitly believes they are.

  • Unlikely, unless his view has changed substantially in the last seven years: https://blog.cleancoder.com/uncle-bob/2017/01/11/TheDarkPath.html

    I think his views on how to achieve good quality software are nearly antithetical to the goals of Rust. As expressed in that blog post and in Clean Code, he thinks better discipline, particularly through writing lots and lots of explicit unit tests, is the only path to reliable software. Rust, on the other hand, is very much designed to make the compiler and other tooling bear as much of the burden of correctness as possible.
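
    As a trivial example of that burden-shifting (a minimal Rust sketch; the User type and find_user function are invented for illustration): a possibly-absent value has a different type, so forgetting to handle the absent case is a compile error rather than something a hand-written test has to catch.

    ```rust
    struct User {
        name: String,
    }

    // A lookup that can fail returns Option<&User>, not a nullable &User.
    fn find_user<'a>(users: &'a [User], name: &str) -> Option<&'a User> {
        users.iter().find(|u| u.name == name)
    }

    fn main() {
        let users = vec![User { name: "alice".into() }];

        // This would not compile: the result is an Option, not a User.
        // let name = find_user(&users, "bob").name;

        // The compiler forces the "absent" case to be handled somehow:
        match find_user(&users, "bob") {
            Some(u) => println!("found {}", u.name),
            None => println!("no such user"),
        }
    }
    ```

    (And this is exactly the kind of compiler-enforced non-nullability that "The Dark Path" complains about.)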

    (To be clear, I realize you're kidding. But I do think it's important to know just how at odds the TDD philosophy is with the "safe languages" philosophy.)