
  • What does Go’s simplicity have to do with dependency injection?

    In my experience, following Go's philosophy of simple solutions eliminates the need for complex solutions such as dependency injection.

    How do you unit test your extremely complex projects if your business logic carries the additional responsibility of creating objects?

    I write modular code that accepts interfaces so I can test the components I want to test. The vast majority of object creation happens at initialization time, not in the business logic. For the projects I've worked on, that would be true with or without DI - I don't see how that's relevant.

    Perhaps your “extremely complex projects” wouldn’t be so extremely complex if you practiced dependency injection?

    When the CTO says, "Make it distributed and sharded," I do what I'm told, but that is an intrinsically complex problem. The complexity is in the overall behavior of the system. If you zoom in to the individual execution units, the business logic is relatively simple. But the behavior of the system as a whole is rather complex, and DI isn't going to change that.

    Edit: I was interpreting "using DI" to mean using a DI framework such as Unity, and I would be happy to never need one of those frameworks ever again.
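    A minimal Go sketch of the "accept interfaces, test with a stub" approach described above (all names here are made up for illustration):

```go
package main

import "fmt"

// Notifier abstracts the delivery mechanism the business logic depends on.
type Notifier interface {
	Send(msg string) error
}

// Greeter holds its dependency as an interface, handed in at
// construction time rather than created inside the business logic.
type Greeter struct {
	n Notifier
}

func (g Greeter) Greet(name string) error {
	return g.n.Send("hello, " + name)
}

// stubNotifier records messages so a test can assert on them -
// no DI framework needed, just a hand-written test double.
type stubNotifier struct{ sent []string }

func (s *stubNotifier) Send(msg string) error {
	s.sent = append(s.sent, msg)
	return nil
}

func main() {
	stub := &stubNotifier{}
	g := Greeter{n: stub}
	_ = g.Greet("world")
	fmt.Println(stub.sent[0]) // prints "hello, world"
}
```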

  • I understand the principles, how branch prediction works, and why optimizing to help out the predictor can help. My question is more: how often does that actually matter to the average developer? Unless you're a developer on numpy, gonum, cryptography, digital signal processing, etc., how often do you have a hot loop that can be optimized with branchless programming techniques? I think my career has been pretty average in terms of the projects I've worked on, and I can't think of a single time I've been in that situation.

    I'm also generally aggravated by what skills the software industry thinks are important. I would not be surprised to hear about branchless programming questions showing up in interviews, but those skills (and algorithm design in general) are irrelevant to 99% of development and 99% of developers in my experience. The skills that actually matter (in my experience) are problem solving, debugging, reading code, and soft skills. And being able to write code, of course, but that almost seems secondary.
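    For reference, here's roughly what the branchless technique mentioned above looks like in Go (a toy example of my own, assuming 64-bit int and no overflow in the subtraction - not a tuned kernel):

```go
package main

import "fmt"

// countAtLeast counts elements >= threshold without a data-dependent
// branch in the loop body: the sign bit of (x - threshold) is turned
// into 0 or 1 with an arithmetic shift, so there is no if statement
// for the CPU's branch predictor to mispredict.
func countAtLeast(xs []int, threshold int) int {
	n := 0
	for _, x := range xs {
		// (x - threshold) >> 63 is -1 when x < threshold and 0 otherwise,
		// so this adds 1 exactly when x >= threshold.
		n += 1 + (x-threshold)>>63
	}
	return n
}

func main() {
	fmt.Println(countAtLeast([]int{3, 9, 5, 12, 1}, 5)) // 9, 5, 12 -> prints 3
}
```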

  • I've heard of Rust. It sounds noisy and even more verbose than Go, which is already a fairly verbose language. I haven't had any reason to learn Rust, so I haven't done so. The error handling is annoying, but at this point I don't really notice it anymore. And as interolivary said, Go has generics now.

  • I use various strategies depending on what seems appropriate, including the two you mention. I've never felt the lack of DI.

  • How does dependency injection have anything to do with writing tests? I write tests by writing a test function that executes the code I want to test...

  • I've been working primarily in Go for the past five years, including some extremely complex projects, and I have never once wished I had dependency injection. It has been wonderful. I have used dependency injection - previously I worked on a C# project for years, and that used DI - but I adore Go's simplicity and I never want to use anything else (except for JS for UI, via Electron or Wails for desktop).

    Edit: If we're talking about dependency injection in the general sense (separation of concerns, modularization, loose coupling), then yeah I agree that's kind of critical to writing good, maintainable software. When I hear "dependency injection" I think of frameworks such as Unity, and that is what I was specifically talking about - I am very happy with the fact that I have felt zero need to use any framework like that over the last five years.
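    In other words, plain constructor injection wired by hand at startup. A sketch of what I mean, with made-up names:

```go
package main

import "fmt"

// Store abstracts persistence; the concrete implementation is
// chosen and injected by hand at startup, no container required.
type Store interface {
	Get(key string) string
}

type memStore struct{ data map[string]string }

func (m memStore) Get(key string) string { return m.data[key] }

// Service receives its dependency through its constructor -
// "dependency injection" in the plain, framework-free sense.
type Service struct{ store Store }

func NewService(s Store) *Service { return &Service{store: s} }

func (s *Service) Greeting() string { return "hello, " + s.store.Get("name") }

func main() {
	// All the wiring happens here, once, at initialization time,
	// keeping object creation out of the business logic.
	store := memStore{data: map[string]string{"name": "gopher"}}
	svc := NewService(store)
	fmt.Println(svc.Greeting()) // prints "hello, gopher"
}
```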

  • Are there seriously professionals out there who think debuggers are useless? That is utterly baffling to me. Logging and tests are useful, but if something unexpected happens, the debugger is absolutely the first tool I'm reaching for unless I'm dealing with remote code (e.g. on a server) or some other scenario where using a debugger is a pain.

  • PHP was the first language I did any significant coding in. I will never use it again if I can at all avoid it.

  • It seems to me that programming evolves too quickly for this to be a significant occurrence. Granted, my dad switched careers away from programming when I was 3, but his experience and mine are radically different. Though the first programming I ever did was on one of his old programmable HP calculators.