SlashData: Rust sees fastest growth, JavaScript still dominates
BatmanAoD @programming.dev
In this context, stabilization refers to the adoption growth curve flattening out.
It's not so much that there are things you "can't do" in other shells or older Bash; it's that existing shell scripts break, which is frustrating.
Just because it doesn't matter for most users doesn't mean it isn't a real limitation. I acknowledged as much in my original comment.
You get admin privileges on macOS like a big boy. You can use bash or zsh commands in Terminal all you want.
Cool. So try updating to a version of Bash from the last 15 years. The pre-installed one is Bash 3, because Bash 4 and 5 are licensed under GPLv3, which Apple won't comply with.
...ah, no, you can't update the pre-installed Bash, because it's on a section of the file system that is read-only even with admin access. You can install Bash 5 as a separate shell, and use that as your default terminal shell, but any scripts written with the standard `#!/bin/bash` instead of the more flexible `#!/usr/bin/env bash` will still use Bash 3.
This "handholding" (or really, a safety net) is arguably a good thing, or at least a positive tradeoff; but you can't claim it doesn't exist.
`rm -rf` is the only version that makes sense, since the only reason to delete and re-clone is to recover from an unexpected `.git/` state, and `git rm` won't remove that.
The second one!
The bit in Big Hero 6 with the video records of Tadashi inventing Baymax is about as close to this as I've ever seen in a sci-fi action movie.
Sorry, why would you be "boned" if you have UTC time? Are you thinking of the case where the desired behavior is to preserve the local time, rather than the absolute time?
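To make the distinction concrete, here's a minimal sketch (my own example, using the third-party chrono crate, not anything from the thread). Storing UTC preserves the absolute instant; rendering it in a zone is purely a display concern. The case where UTC alone "bones" you is wanting to preserve a *local wall-clock* time (say, "9am next Tuesday") across timezone-rule changes:

```rust
use chrono::{DateTime, Local, Utc};

fn main() {
    // An absolute instant: unambiguous, and safe to store or compare.
    let stored: DateTime<Utc> = Utc::now();

    // Converting for the user's zone is purely a display concern;
    // the underlying instant is unchanged.
    let shown: DateTime<Local> = stored.with_timezone(&Local);
    println!("stored (UTC):  {stored}");
    println!("shown (local): {shown}");
}
```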
I'm not totally clear on why signals are used here in the first place. Arguably most C code doesn't "need" to use signals in complex ways, either.
The trope will be "old" once the mainstream view is no longer that C-style memory management is "good enough".
That said, this particular vulnerability was primarily due to how signals work, which I understand to be kind of unavoidably terrible in any language.
Indeed, I had no idea there are multiple languages referred to as "APL".
I feel like most people defending C++ resort to "people shouldn't use those features that way". 😅
As far as I can tell, pointer arithmetic was not originally part of PASCAL; it's just included as an extension in many implementations, but not all. Delphi, the most common modern dialect, only has optional pointer arithmetic, and only in certain regions of the code, kind of like `unsafe` in Rust. There are also optional bounds checks in many (possibly most) dialects. And in any case, there are other ways in which C is unsafe.
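Since I brought up the Rust analogy, here's a minimal sketch (mine, not from the thread) of how Rust fences pointer arithmetic off into marked regions:

```rust
fn main() {
    let xs = [10i32, 20, 30];
    let p = xs.as_ptr();

    // Pointer arithmetic and raw-pointer dereference only compile
    // inside an `unsafe` block -- the compiler fences off the region
    // where its usual guarantees don't apply.
    let second = unsafe { *p.add(1) };
    assert_eq!(second, 20);
}
```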
True, but AFAIK they all sucked really bad.
That's pure assumption and, as far as I can tell, not actually true. PASCAL was a strong contender. No language was competitive with handwritten assembly for several decades after C's invention, and there's no fundamental reason why PASCAL couldn't benefit from intense compiler optimizations just as C has.
Here are some papers from before C "won", a more recent article about how PASCAL "lost", and a forum thread about what using PASCAL was actually like. None of them indicate a strong performance advantage for C.
I'm honestly not convinced JavaScript is good even for the front-end, because it's intentionally designed to swallow, ignore, and otherwise minimize errors, which isn't helpful for developing any kind of software. My point is that the only reason JavaScript is dominant in front-end development is that, prior to WASM, it was literally the only option; if that hadn't been the case, I doubt it would have become nearly so widely used.
C++11 also introduced new problems, such as the strange interaction between brace-initialization and initializer-lists (though that was partially fixed several years later), and the fairly arcane rules around move semantics with minimal compiler support (for example, it would be great if the standard required compilers to emit an error if a moved-from object were accessed).
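For contrast, Rust ships exactly the moved-from diagnostic I wish the C++ standard required. A minimal sketch (my own example):

```rust
fn main() {
    let s = String::from("hello");
    let t = s; // ownership of the string moves from `s` to `t`

    // Uncommenting the next line is a compile-time error:
    // error[E0382]: borrow of moved value: `s`
    // println!("{s}");

    println!("{t}");
}
```

In C++, reading from a moved-from object typically compiles without complaint and merely leaves you with a "valid but unspecified" value.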
I know Lisp is minimal, I'm just saying that I expect there are Lisp fans who won't acknowledge (or would excuse) any shortcomings in the language, just as there are C++ fans who do the same for C++.
Sounds like we're actually in agreement about most of this.
I'm okay with languages limiting their "expressive" power in order to provide stronger correctness guarantees or just limit how "weird" code looks; but this is largely because I've worked on many projects where someone had written a heap of difficult-to-understand code, and I doubt such limitations would be appealing if I were working strictly on my own.
I also don't really see the appeal of Java-style inheritance, but to be honest I didn't use Scala for long enough to know whether or not I agree that Scala does inheritance "right".
It does make sense that Rust provides mutability in some cases where Scala doesn't. Rust's superpower, enabled by the borrow checker, is effectively "safe mutability." I hope other, simpler languages build on this invention.
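A minimal sketch of what "safe mutability" means in practice (my example, not from the conversation):

```rust
fn main() {
    let mut v = vec![1, 2, 3];

    // Exclusive (&mut) access: while `m` is live, no other reference
    // to `v` can exist, so mutation can't invalidate anyone's view.
    let m = &mut v;
    m.push(4);

    // Once the exclusive borrow ends, shared borrows are fine again.
    let r = &v;
    println!("{r:?}");
}
```

The design choice is that mutation is allowed precisely when aliasing is ruled out, rather than being forbidden outright the way pure-functional languages (and Scala's `val`-everywhere style) tend to do.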
I don't really like the title either, but the article does demonstrate how unfortunate it is that we're effectively locked into using the C ABI at some level of nearly every piece of software.
That said, there definitely were languages with better type systems prior to the invention of C. Pascal is a frequently-cited example.
Sorry, I'm not sure what your point is. I realize that you can almost completely avoid JavaScript, but my point is merely that there's a real technical barrier limiting the choices developers can make for front-end code. WASM is making great strides in breaking down that barrier (something I've been thrilled to see happen, though it's going much more slowly than I had hoped), but the barrier is still there. Conversely, no such barrier has ever existed on the backend, except in the sense that C limits what all other languages can do.
Ehhh, I mean this more strongly. I've never met people more in denial about language design problems than C++ adherents. (Though admittedly I haven't spent much time talking to Lisp fans about language design.)
The fairly unique thing about the web is that the tech stack is pretty much entirely dependent on what browsers are "winning" at any given time. There are web standards, but Chrome steamrolls them regularly (either by ignoring them or by pressuring the committee to standardize what they want). This is why browser monoculture is bad, and why people recommend Firefox and other non-Chrome (or really, non-WebKit) browsers, as a matter of principle.
So right now, with Chrome's dominance, Mozilla's struggles, and the extremely slow progress of WebAssembly and WASI, it definitely feels like JavaScript will remain dominant for a long while. But since Chrome does support WebAssembly and Google participates in WASI, and since there's no fundamental reason why WASI can't eventually provide everything that JS does today, there's good reason to expect the JS stranglehold not to last forever.
And the great thing about WASM/WASI is that, since it's designed as a compilation target rather than a language, there won't be any remaining reason for a single language (such as Rust) to dominate. Rust got an early lead in WASM because they put the effort into making it a viable target platform for the compiler, and because it's the kind of language that attracts people who dislike JavaScript (such as myself). But there's no reason a different language couldn't rapidly become the most commonly used WASM language if, say, a web framework in a previously-niche language becomes popular. (After all, Rails, a back-end framework, is what popularized Ruby.)
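To make "compilation target, not a language" concrete, this is roughly the smallest Rust-to-WASM export (my sketch; the build setup described in the comments is one typical configuration, not the only one):

```rust
// lib.rs of a crate with `crate-type = ["cdylib"]` in Cargo.toml;
// build with: cargo build --release --target wasm32-unknown-unknown

// Export a plain function symbol so a WASM host (a browser, or a
// WASI runtime) can call it directly.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}
```

Any language with a WASM backend can export the same kind of function, which is exactly why no single source language is privileged the way JavaScript has been.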
Edit to add: I say "fairly unique", but in fact there's a very analogous situation with C: https://faultlore.com/blah/c-isnt-a-language/