  • Packages often enable the services they install right away.

    That's a problem of the package, not the package manager.

    Generally this fits with Debian's philosophy. But regardless, I think it's out of scope for the question of why Apt is good: you could build a distro on Apt whose packages don't do this.

    To temporarily install a package [...]

    I'm not talking about apt the CLI tool, but the actual package manager. The plain apt tool is just a convenience wrapper for common workflows implemented in the other tools.

    As you correctly pointed out, Apt distinguishes between packages installed as a dependency ("auto installed") and packages installed directly ("manually installed"). This is precisely one of the reasons I consider Apt the best package manager. (Yes, I know some other package managers can do this too, but not all of them.)

    If you install a package as manually installed and later want to mark it as auto installed, you can do that with apt-mark.
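
    For example (the package name foo is just a placeholder):

    ```sh
    sudo apt install foo          # foo starts out "manually installed"
    sudo apt-mark auto foo        # reclassify it as "auto installed"
    apt-mark showauto | grep foo  # confirm the new state
    sudo apt autoremove           # removes foo once nothing depends on it
    ```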

    dpkg-scanpackages is eternally slow.

    Are you maintaining a PPA for others?

    Frankly, I've never run into this problem.

    The standard packaging tools [...] are insane.

    dh_make helps you create a package that adheres to Debian policy, and there is good reason for Debian to have those policies. But if you're just packaging something yourself, you don't have to use it. It's just a template for new packages.

    At the end of the day, all you really need to create a deb is two files: debian/control and debian/rules. Together they're the equivalent of a PKGBUILD. The control file holds all of the dependency metadata, and the rules file is the build/install script.
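
    Something like this sketch is close to the minimum (names are made up; in practice dpkg-buildpackage also wants a debian/changelog, since that's where the version number comes from):

    ```
    # debian/control
    Source: hello
    Maintainer: Jane Doe <jane@example.com>
    Build-Depends: debhelper-compat (= 13)

    Package: hello
    Architecture: any
    Depends: ${shlibs:Depends}, ${misc:Depends}
    Description: example package
     A minimal example package.
    ```

    ```make
    #!/usr/bin/make -f
    # debian/rules -- an executable makefile; dh runs the standard build steps
    # (the recipe line below must start with a tab)
    %:
    	dh $@
    ```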

    The difference in packaging philosophy is that PKGBUILDs live outside the upstream sources and download them at build time. Debian, on the other hand, rehosts the upstream tarball and adds the debian/ directory alongside it. This makes building Debian packages mostly hermetic: you don't need network access.
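
    For contrast, a PKGBUILD sketch (using GNU hello as the upstream; a real one would pin a checksum):

    ```sh
    pkgname=hello
    pkgver=2.12
    pkgrel=1
    arch=('x86_64')
    source=("https://ftp.gnu.org/gnu/hello/hello-$pkgver.tar.gz")  # fetched over the network at build time
    sha256sums=('SKIP')

    build() {
      cd "$pkgname-$pkgver"
      ./configure --prefix=/usr
      make
    }

    package() {
      cd "$pkgname-$pkgver"
      make DESTDIR="$pkgdir" install
    }
    ```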

    What do you like about it?

    Mostly that it makes super useful distinctions between concepts. But there are other goodies.

    • Manually installed versus auto installed.
    • Removed versus purged.
    • Upgrade versus dist-upgrade.
    • Dependency versus suggestion versus recommendation.
    • The alternatives system.
    • Pinning, and relatedly, that packages can declare version constraints in their dependencies (see the sketch after this list).
    • Interactive configuration at install time.
    • Support for both source and binary packages.
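
    Re: pinning and version constraints, a sketch (package names and versions are just examples). A versioned dependency in debian/control:

    ```
    Depends: libssl3 (>= 3.0.0), python3 (<< 4)
    ```

    And a pin in, say, /etc/apt/preferences.d/99-nginx, holding a package to a version series:

    ```
    Package: nginx
    Pin: version 1.24.*
    Pin-Priority: 1001
    ```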

    I also do appreciate that Debian pre-configures packages to work together with the same set of conventions out of the box. But again, that's a property of the packages, not of Apt.

  • From an engineering perspective, I prefer Debian distros. Apt is the greatest package manager ever built. For a production server, I'd choose Debian or maybe Ubuntu if I needed to pay someone for support.

    But for a desktop, Ubuntu kinda sucks. These days, I think I'd recommend Fedora to Linux noobs.

    And for my toys at home, I run Arch btw.

  • At the end of the day, the query engine is going to operate on some kind of plan object (a tree of operations).

    So it's really just a matter of which language frontends are offered. Obviously you have to provide a SQL frontend. And since PRQL compiles to SQL (it's all the same relational algebra under the hood), the easy path is just PRQL -> SQL -> query plan.
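
    Roughly this kind of translation (table and column names are made up):

    ```
    # PRQL
    from invoices
    filter total > 100
    select {customer_id, total}
    ```

    ```sql
    -- the SQL it compiles to, more or less
    SELECT customer_id, total
    FROM invoices
    WHERE total > 100
    ```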

    I don't think there is any benefit in maintaining a direct PRQL-to-query-plan frontend. It's just another frontend that you have to maintain.

    Effectively, you can use SQL as an intermediate representation, so the PRQL frontend only has to be written once and supports every DB, modulo some backend-specific optimizations per DB.

    It's much more cost-effective to maintain a set of per-DB optimizations for a shared PRQL frontend than to maintain an entire PRQL frontend that only works for your DB.

    That said, I would love it if a DB offered PRQL natively, even if that just means shipping the open source PRQL frontend as part of the DB.

  • Maybe it's not so poorly written. The ambiguity could be a feature.

    If the manufacture date can't be proven, you shouldn't be able to sell the gun. So maybe more guns get prohibited in practice than would otherwise be allowed.

    And it forces folks to keep more detailed records going forward.

  • That banner image is Android Automotive, not Android Auto.

    • Android Auto is an application on your phone that displays on the car's infotainment system.
    • Android Automotive is when the car's infotainment system runs Android directly.
  • Why does Google Slides only get 4/5 for compatibility?

    It literally works on everything. I'm surprised LaTeX scored higher on compatibility, because the install is heavy (8 GB or more, as you said) and you still have to configure things afterwards (e.g., switching to XeLaTeX for Unicode support). IMO, for compatibility, Slides is a 5/5 and LaTeX a 4/5.
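
    For reference, the Unicode setup is only a few lines, compiled with xelatex instead of pdflatex (the font is just an example that ships with most TeX distributions):

    ```latex
    \documentclass{article}
    \usepackage{fontspec}              % needs XeLaTeX or LuaLaTeX
    \setmainfont{Latin Modern Roman}   % any installed font works here
    \begin{document}
    Unicode input works directly: naïve café, „Gänsefüßchen“.
    \end{document}
    ```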

    EDIT: Also Google Slides works offline, and you can install it as a PWA by clicking a button in Chrome.

  • It was not Electron's problem.

    The problem was the extension architecture, which they leaned into heavily. It encouraged basically every part of the system to interact with every other part of the system, like giving extensions free rein over the whole DOM. That's what the creators meant by a "hackable" editor.

    VS Code is much faster, largely because of its much saner extension architecture. Extensions are much better isolated, with a much smaller API surface through which they can interact with the editor. And the LSP design means core IDE-like features can be lifted into a privileged part of the system and implemented once with performance in mind, while the actual analysis runs asynchronously in subprocesses.
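
    To illustrate the isolation, here's a minimal extension entry point (the command id and message are made up; the API names are from the real vscode module):

    ```typescript
    import * as vscode from 'vscode';

    export function activate(context: vscode.ExtensionContext) {
      // Extensions run in a separate extension-host process and can only
      // touch the editor through this narrow, versioned API -- there is
      // no direct access to the renderer's DOM, unlike in Atom.
      const cmd = vscode.commands.registerCommand('demo.hello', () => {
        vscode.window.showInformationMessage('Hello from an isolated extension');
      });
      context.subscriptions.push(cmd);
    }

    export function deactivate() {}
    ```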

    If you actually used both Atom and VS Code configured to feature parity, you'd notice that VS Code is miles ahead. Microsoft did an amazing job of proving that you can build complex, performant software on Electron.

    Yes, Electron 2.0.0 was a great update, but it's not the reason for the performance difference. The reason was better software architecture.

  • If personal data is for sale, the NSA should buy it. Like, it's negligent not to, because they need to know what exactly is being sold in these markets. It's not about spying on the people in the data, it's about knowing the market and who has access to what.

    The problem isn't that it was bought, it's that it was sold.

    We need to dismantle the data brokerage industry.

  • “Our demands are reasonable and well within the budget of G/O Media and its private equity owners Great Hill Partners, who made an estimated $44 million in revenue in 2023.”

    How many employees do they have? And what is their current pay?

    I think I'm just surprised that they are referencing a $44M revenue in their statement, because that doesn't seem like a lot.

    Just doing some napkin math. $44M is like:

    • 88 employees at a cost of $500k/employee,
    • 220 employees at a cost of $200k/employee,
    • 440 employees at a cost of $100k/employee.

    And cost is a lot more than just the salary of the employees. It's also things like insurance and 401(k) match. Not to mention real estate and IT infrastructure.

    If it's just like 200 employees, then they can probably afford to pay the writers six-figure salaries.

    But at like 500 employees, the financial situation is pretty dicey. That's $88k in salary per employee with nothing left over for anything else. And that isn't a great salary in NYC.
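
    The napkin math as a quick script (headcounts are guesses, and it unrealistically assumes all revenue goes to headcount):

    ```typescript
    const revenue = 44_000_000; // reported 2023 revenue, USD

    for (const headcount of [88, 220, 440, 500]) {
      const perEmployee = Math.round(revenue / headcount);
      console.log(`${headcount} employees -> $${perEmployee.toLocaleString()}/employee`);
    }
    // 88 -> $500,000; 220 -> $200,000; 440 -> $100,000; 500 -> $88,000
    ```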

    Google says the count is somewhere between 200 and 500...

    I don't doubt that the writers are underpaid for NYC and SF. What surprises me more is that G/O Media made only $44M last year.