Posts 1 · Comments 301 · Joined 2 yr. ago

  • The author also makes some incorrect or misleading claims, specifically about Emacs. I'll grant that Emacs has a high bar to entry, and I don't personally like it, but it's not modal, and it can display images and Markdown previews.

  • I know several world class programmers, and interestingly, the commonality among them is that they all seem to use Vim as their code editor.

    Many people I know who think of themselves as world class programmers use Emacs.

    What a burn!

  • It's simply not true that there "aren't really that many definitions of OOP", much less that the guide you've linked is "comprehensive" when it is specifically about Java.

    This is a good, brief post about the different conflicting definitions: https://paulgraham.com/reesoo.html

    This is a much more comprehensive but also less focused overview, with many links, from a site that is effectively both a wiki and a forum: https://wiki.c2.com/?ReesOnObjectOrientedFeatures

  • Go if you want a real mental challenge

    I don't mean to be rude, but I find this baffling; what do you mean by it? One of the primary design goals of Go is to be simple to learn (this is fairly well documented), and it's one of the few things I really have to give the language credit for. Rob Pike has specifically discussed wanting it to be accessible to recent CS graduates who have mostly used Java. I have never heard anyone before describe learning Go as a "challenge."

  • You're misunderstanding the posts you're explaining. Sanitizers, including ASan, HWASan, and the bounds sanitizer, are not "static analysis tools". They are runtime tools, which is why they have a performance impact, and they're not intended to be deployed as part of a final executable; there's a minimal example at the end of this comment.

    I don't know how you can read this sentence and interpret it to mean that they "haven't onboarded AddressSanitizer":

    In previous years our memory bug detection efforts were focused on Address Sanitizer (ASan).
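
    For anyone unfamiliar with how ASan is used: you opt in at build time, but the bug is only caught when the bad access actually executes, which is exactly why it costs runtime performance and isn't shipped in release builds. A minimal sketch (-fsanitize=address is the real clang/gcc flag; the toy program and file name are mine):

        /* heap-overflow.c: a deliberately buggy toy program */
        #include <stdlib.h>

        int main(void) {
            int *a = malloc(8 * sizeof *a);
            int x = a[8];   /* reads one element past the end of the allocation */
            free(a);
            return x;
        }

    Build and run with clang -fsanitize=address -g heap-overflow.c && ./a.out. It compiles without complaint; only at runtime does ASan abort with a heap-buffer-overflow report. A static analysis tool, by contrast, would flag the code without ever running it.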

  • If you have a Linux or Mac handy, you can try it out! It's...kinda wild. If you know some Vim commands that start with :, there's a good chance they'll work in ed, except that you don't type the : itself (effectively, you're always in "command mode"). There's a short example session at the end of this comment.

    There's also a novelty Twitter account, @ed1conf, that tweets about ed.

    Some coworkers told me a story about a previous job candidate who said his preferred editor was ed. They thought it would be fascinating to see someone actually use it, but during the interview, when he opened ed, he didn't recognize it at all. It turned out he was accustomed to a graphical editor that he believed was called ed, because he had done all his work on a system where someone had symlinked or aliased ed to a modern tool.
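
    As promised, a short example session (the file name is made up; the commands and output are standard ed behavior):

        $ printf 'hello\nworld\n' > notes.txt
        $ ed notes.txt
        12
        ,p
        hello
        world
        s/world/there/
        w
        12
        q

    The 12 is ed reporting "12 bytes read" (and later "12 bytes written"); ,p prints the whole buffer, and s, w, and q behave just like Vim's :s, :w, and :q.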

  • One problem with this is that C is in no way the "roots" of programming; it's older than most of the languages we use today, but Fortran, Lisp, and COBOL are all older and are also still in use. (And of course there are other languages that predate C but have mostly fallen out of use, such as Pascal.) It feels "low-level" because it closely reflects the hardware for which it was originally designed, the PDP-7 and later the PDP-11. But in fact it hasn't truly been "low-level" for a long time: I highly recommend the ACM Queue article "C Is Not a Low-level Language" (subtitle: "Your computer is not a fast PDP-11").

  • ed, the "standard editor" (according to its man page) and the predecessor of vi (the "visual editor"), is a terminal editor that doesn't automatically display any of the text you're working on; you have to use the p ("print") command to display the lines you wish to see.

  • You are making an extreme assumption, and it also sounds like you've misread what I wrote. The "attempts" I'm talking about are studies (formal and informal) to measure the root causes of bugs, not the C or C++ projects themselves.

    I cited one specific measurement, Daniel Stenberg's analysis of the curl codebase. Here's a separate post about the testing and static analysis used for curl.

    Here's a post with a list of other studies. The projects analyzed are:

    • Android (both the full codebase and the Bluetooth & media components)
    • iOS & macOS
    • Chrome
    • Microsoft (this is probably the most commonly cited one)
    • Firefox
    • Ubuntu Linux

    Do you really think that Google, Apple, Microsoft, Mozilla, and the Ubuntu project "don't even consider onboarding basic static analysis tools" in their C and C++ projects?

    If you're curious about the specifics of how errors slip through anyway, here's a talk from CppCon 2017 about how Facebook, despite copious investment in static analysis, still had "curiously recurring" C++ errors. It's long, but I think it's worthwhile; the most interesting part to me starts around 29:40, where the speaker asks an audience of C++ users whether some specific code compiles, and only about 10% of them get the right answer, one of whom is an editor of the C++ standard.

  • I very much understand thinking that Rust has too much hype, but the differences between C and Rust are so fundamental that "switching between" them just to "keep your interest fresh" seems ill-advised to me. To be honest, your statements about both C and Rust so far seem pretty superficial; have you actually used Rust for anything nontrivial?

    C syntax is simple, yes, but C semantics are not; there have been numerous attempts to quantify what percentage of C and C++ software bugs and/or security vulnerabilities are due to the lack of memory safety in these languages. The results have varied widely, but the most conservative (from this blog post about curl; see the section "C is unsafe and always will be") was an estimate of 40%, or 50% if you count only critical bugs. If I recall correctly, Microsoft did a similar study on one of their projects and reported a rate closer to 70%.

    This means that the choice of language is not just about personal preference. Bugs aren't just extra work for software developers; they affect all users of software, which means they affect pretty much everyone. And, crucially, they're not just annoyances; cyberattacks of various kinds are extremely prevalent and can have a huge impact on people. So if 50% or more of critical software vulnerabilities are due to the choice of language, then that is a very good reason to pick a safer language.

    Rust is not the only choice for memory-safe languages. If you like the simplicity of C, you should definitely learn Go (it's explicitly designed to be as simple as possible to learn). But I would also strongly recommend looking into Zig, which hews much closer to C than Rust does; in fact, it has probably the best interoperability with C of any modern language.

  • Rust's 1.0 release (i.e. the date on which the language received any sort of stability guarantee) was in 2015, and this article was written in 2019. Measuring the pace of feature development of a four-year-old language by its release notes, and comparing against a 50-year-old language by counting bullet points in Wikipedia articles, is absolutely ridiculous.

    Yes, younger languages adopt features more quickly, and Rust was stabilized in a "minimum viable product" state, with many planned features not yet ready for stabilization. So of course the pace of new features in Rust is high compared to older languages. But Wikipedia articles are in no way comparable to release notes as a measure of feature adoption.

    I think C is faster, more powerful, and more elegant.

    "More elegant" is a matter of opinion. But "faster" and "more powerful" should be measurable in some way. I'm not aware of any evidence that C is "faster" than Rust, and in fact this would be extremely surprising since they can both be optimized with LLVM, and several of the features Rust has that C doesn't, such as generics and ubiquitous strict aliasing, tend to improve performance.

    "Powerful" can mean many things, but the most useful meaning I've encountered is essentially "flexibility of application" : that is, a more powerful language can be used in more niches, such as obscure embedded hardware platforms. It's really hard to compete with C in this regard, but that's largely a matter of momentum and historical lock-in: hardware vendors support C because it's currently the lowest common denominator for all hardware and software. There's nothing about Rust the language that makes it inappropriate for hardware vendors to support at a low level. Additionally, GCC is probably the toolchain with the broadest hardware support (even hardware vendors that use a bespoke compiler often do so by forking GCC), and Rust currently has two projects (mrustc and gccrs) working to provide a way to use GCC with Rust. So even the advantage C has in terms of hardware support is narrowing.

    But note that there are also niches for which C is widely considered less appropriate than Rust! The most obvious example is probably use in a front-end web application. Yes, C should in theory be usable on the front-end via Emscripten, but Rust has had decent support for compiling to WebAssembly almost as long as it's been stabilized.
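
    To make the generics point from earlier in this comment concrete, here's a sketch in C (the toy program is mine; qsort and its signature are standard C): qsort erases element types, so every comparison goes through void pointers and an indirect function-pointer call that the optimizer usually can't inline.

        #include <stdio.h>
        #include <stdlib.h>

        /* comparator for qsort: receives untyped pointers */
        static int cmp_int(const void *a, const void *b) {
            int x = *(const int *)a;
            int y = *(const int *)b;
            return (x > y) - (x < y);  /* avoids the overflow risk of x - y */
        }

        int main(void) {
            int v[] = {3, 1, 2};
            /* each comparison is an indirect call through cmp_int */
            qsort(v, 3, sizeof v[0], cmp_int);
            printf("%d %d %d\n", v[0], v[1], v[2]);
            return 0;
        }

    Rust's equivalent, v.sort_by(|a, b| a.cmp(b)), is monomorphized: the compiler generates a sort specialized to the element type and the closure, so the comparison typically inlines away. That's one concrete mechanism behind "generics tend to improve performance."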

  • Well, sure, if the appeal of vim for you is that it "just works" on every platform you use, then there's no advantage to adopting neovim. That's no reason to complain that neovim isn't meeting your needs, though.