I broadly agree with the author’s point there, but disagree with the specific language he used. In my view, engineering includes those pesky non-technical considerations, like the business context and the human factors, which bring their own tradeoffs and priorities to the engineering decision-making.
That is, his “pure engineers” are not really doing engineering, at least under my understanding of the term, whereas (some of) the impure engineers actually are! :)
(Good) Abstraction is there to hide complexity. I don't think it's controversial to say that software has become extremely complex. You need to support more spoken languages, more backends, more complex devices, etc.
The most complex thing to support is people's resumes. If carpenters were incentivized the way software devs are, we'd quickly start seeing multi-story garden sheds in reinforced concrete, because every carpenter's dream job at Bunkers Inc. pays 10x more.
I suspect the majority of the AI bubble is closer to the subprime mortgage / housing bubble than to the dotcom one. More of the money and fallout is tied to real estate and debt deals being all topsy-turvy.
There are a bunch of companies that amount to a good prompt + tools, and there is certainly some dotcom feeling there, but the money side of those seems tiny compared to the infra investments at the top of the tech food chain.
Also, hot take: Kotlin simply does not need this many refactoring tools, thanks in part to its first-class FP support. In fact, almost every non-Android Kotlin dev I have ever met would be totally happy with analysis and refactoring on par with Rust Analyzer.
But even with an LSP, I would still need IDEA (at least Community) for Java -> Kotlin migration and smooth Java interoperability.
Not just them. Anyone slightly critical of vaccines, Russiagate, etc. Anyone warning about building this censorship apparatus. To paraphrase "A Man for All Seasons": they cut down every law to get after the Devil.
Now the Devil has turned round, and there are no laws left to protect them.
It removes a class of security vulnerabilities, modulo any unsound unsafe (in the compiler, std/core, and added dependencies).
In practice you see several orders of magnitude fewer segfaults (as in Google's Android CVE data). You can compare the Deno and Bun issue trackers for segfaults to see it in action.
As mentioned a billion times, seatbelts don't prevent death, but they do reduce the likelihood of dying in a traffic accident. Rust isn't a magic bullet, but it's a decent-caliber round.
> Our historical data for C and C++ shows a density of closer to 1,000 memory safety vulnerabilities per MLOC. Our Rust code is currently tracking at a density orders of magnitude lower: a more than 1000x reduction.
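To make the "modulo unsound unsafe" caveat concrete, here is a minimal hypothetical sketch (not from any real CVE) of how a single wrong safety claim reintroduces exactly the bug class that safe Rust rules out:

    fn dangling() -> &'static str {
        let s = String::from("temporary");
        // The implicit SAFETY claim here is wrong: `s` is dropped when the
        // function returns, so the reference we conjure up dangles.
        unsafe { &*(s.as_str() as *const str) }
    }

    fn main() {
        // Compiles fine; reading the freed buffer is undefined behavior.
        println!("{}", dangling());
    }

The point stands either way: the surface you have to audit shrinks from the whole program to the `unsafe` blocks.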
I'm not sure this opt-in/opt-out "philosophical razor" is as sharp as one would like it to be. I think "optionality" alone oversimplifies, and a person trying to adopt that rule for taxonomy would just have a really hard time; that might be telling us something.
For example, in Nim, at the compiler CLI level there is opt-in/opt-out via the `--mm=whatever` flag, but at the syntax level Nim has both `ref T` and `ptr T` on equal syntactic footing. Then in the stdlib, `ref` types (really, things derived from `seq[T]`) are used much more (since they're so convenient). Meanwhile, runtimes are often deployment properties: if every Linux distro had its libc link against -lgc for Boehm, people might say "C is a GC'd language on Linux". Minimal CRTs vary across userspaces and OS kernel/userspace deployments. "What you can rely on/assume", which I suspect is the thrust behind "optionality", just varies with context.
Similar binding vagueness between the properties (good, bad, ugly) of a language's '"main" compiler', the 'language itself', its '"std"lib', and "common" runtimes/usage happens all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented"). That doesn't even bring in "use by common dependencies", which is an almost independent axis/dimension and starts to relate to the coordination problem of "What should even be in a 'std'-lib, or any lib, anyway?".
I suspect this rule is trying to make the adjective "GC'd" do more work in an absolute sense than it realistically can, given the diversity of programming languages (a diversity not always visible if you consider only workaday corporate languages). It's not always easy to define things!
> I think "optionality" alone oversimplifies and a person trying to adopt that rule for taxonomy would just have a really hard time and that might be telling us something.
I think optionality is what gives that definition weight.
Think of it this way: you come to a project like a game engine, find it's written in some language, and discover that for your usage you need no/minimal GC. How hard is it to minimize or remove the GC? Assume that changing build flags will also cause problems elsewhere due to behavior changes.
> Similar binding vagueness between the properties (good, bad, ugly) of a language's '"main" compiler', the 'language itself', its '"std"lib', and "common" runtimes/usage happens all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented")
Vagueness is an intrinsic quality of human language. You can't escape it.
The logic is fuzzy, but going around saying things like "Rust is a GC language" because it has an optional, rarely used Arc/Rc is just off-the-charts wrong.
You can add your own custom GC in C (you can add your own custom anything to any language; it's all just 1s and 0s at the end of the day), but it is not a feature provided by the language out of the box like in Rust. Not the same thing at all; this is very different.
...as part of the language. Hence it being a GC language.
Is this another case of "Rustaceans" randomly renaming things? There was that whole debacle where sum types bizarrely became "enums", even though enum already had an established, different meaning, with all the sad talking past everyone else that followed. This is starting to look like that again.
Which part? It's not available in no-std without the alloc crate. You can write your own Arc.
Most crates don't have to use Arc/Rc.
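For the record, this is roughly what that looks like; a minimal sketch of a no-std library crate (a final binary would also need a global allocator and panic handler):

    #![no_std]

    // Arc lives in the `alloc` crate; without opting into it, a no-std
    // crate has no Arc (or Rc, or Box, or Vec) at all.
    extern crate alloc;

    use alloc::sync::Arc;

    pub fn share(v: i32) -> (Arc<i32>, Arc<i32>) {
        let a = Arc::new(v);
        let b = Arc::clone(&a);
        (a, b)
    }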
> Is this another case of "Rustaceans" randomly renaming things?
No. This is a case of someone not having enough experience with Rust. Saying Rust is a GC language is like claiming Pascal is an object-oriented language because they share some surface similarities.
> It's not available in no-std without the alloc crate.
no-std disables features. It does not remove them from existence. Rust's worldly presence continues to have GC even if you choose to disable it for your particular project.
> This is a case of someone not having enough experience with Rust.
Nah. Even rust-lang.org still confuses sum types and enums to this very day. How much more experienced with Rust can you get than someone versed enough in the language to write comprehensive, official documentation? This thought of yours doesn't work.
> Saying Rust is a GC language is like claiming Pascal is an object-oriented language because they share some surface similarities.
What surface similarity does Pascal have to OO? It only has static dispatch. You've clearly not thought that one through.
Turbo Pascal has dynamic dispatch. Perhaps you've confused different languages that happen to share similar names? That at least starts to gain some surface similarity to OO. But message passing, of course, runs even deeper than just dynamic dispatch.
Your idea is not well conceived. Turbo Pascal having something with a very surface-level similarity to OO, while still a long way from the real deal, isn't the same as Rust actually having GC. It is not a case of Rust having something that sort of looks kind of like GC. It literally has GC.
> no-std disables features. It does not remove them from existence.
It's the other way around: the standard library adds features, because Rust features are designed to be additive.
Look into it. The `std` library is little more than the Rust-provided `core` and `alloc` crates plus an OS-facing layer on top.
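You can verify the layering yourself: `std::sync::Arc`, for instance, is literally a re-export of `alloc::sync::Arc` (a tiny sketch, 2018 edition or later):

    // `alloc` ships with the toolchain, so this works in a normal std crate.
    extern crate alloc;

    // The two paths name the same type, which is why this identity
    // function type-checks: std merely re-exports the alloc type.
    fn same_type(a: std::sync::Arc<i32>) -> alloc::sync::Arc<i32> {
        a
    }

    fn main() {
        let a = std::sync::Arc::new(1);
        assert_eq!(*same_type(a), 1);
    }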
> Nah.
You don't seem to know how features work, how std is made, or how often Rc is encountered in the wild. It's hard to argue when you don't know the language you are discussing.
> Even rust-lang.org still confuses sum types and enums to this very day.
rust-lang.org is the starting point for new Rust programmers; why in the heck would they start philosophizing about a bikesheddy naming edge case?
That's like opening your car manual to read the history of and debates over what types of motors preceded your own, while you're trying to get the damn thing running again.
> What surface similarity does Pascal have to OO?
The dot operator (as in `struct.method`). The guy I was arguing with unironically told me that any language using the dot operator is OO, because the dot operator is a sign of accessing an object or a struct.
Much like you, he had very inflexible ideas about what makes or does not make something OO; it reminds me so much of you saying C++ is a GC language.
> Your idea is not well conceived.
My idea is to capture the colloquial meaning of "GC language". The original connotation is to capture languages like C#, Java, JS, etc., which come with a (more or less) non-removable tracing garbage collector. In practice, what this term means is:
- How hard is it to remove and/or not rely on the GC? Defaults matter a lot (see the sketch below).
- How heavy is the garbage collection? Is it just RC or ARC (reference counting)?
- How much of the ecosystem depends on GC?
And finally: how many people are likely to agree with it? I don't care if my name for a color is closest to the frequency of red if no one else agrees.
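To ground the first point with Rust as the example (a sketch; the functions are made up for illustration): because the default is no GC at all, opting out of reference counting is usually a local signature change rather than a whole-program migration:

    use std::rc::Rc;

    // Shared ownership is something you opt into per type...
    fn sum_rc(xs: &[Rc<i32>]) -> i32 {
        xs.iter().map(|x| **x).sum()
    }

    // ...and opting back out is a local, mechanical change.
    fn sum_ref(xs: &[i32]) -> i32 {
        xs.iter().sum()
    }

    fn main() {
        let shared: Vec<Rc<i32>> = vec![Rc::new(1), Rc::new(2)];
        let plain: Vec<i32> = vec![1, 2];
        assert_eq!(sum_rc(&shared), sum_ref(&plain));
    }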
I can't say I've heard of a commonly used definition of "GC language" that includes C++ and excludes C. If anything, my impression is that both C and C++ are usually held up as exemplars of non-GC languages.
C++ didn't add GC until relatively recently (the C++11 garbage-collection support API), to be fair. When people from 30 years ago get an idea stuck in their head, they don't usually ever change their understanding, even as life continues to march forward. This isn't limited to software. If you look around, you'll regularly find people repeating all kinds of things that were true in the past even though things have changed. And fair enough; there is only so much time in the day. You can't possibly keep up to date on everything.
The thing is that the usual comparisons I'm thinking of generally focused on how much the languages in question rely on GC for practical use. C++11 didn't really move the needle much, if at all, in that respect compared to the typical languages on the other side of said comparisons.
Perhaps I happen to have been around different discussions than you?
> focused on how much the languages in question rely on GC for practical use.
That's quite nebulous. It should be quantified. But while we wait for that: if we assume that by that metric C++ is not a GC language today, but tomorrow C++ developers all collectively decide that every heap allocation must go through std::shared_ptr, then it must become a GC language.
But the language hasn't changed in any way. How can an attribute of the language change without any changes?
Perhaps, but I'm reluctant to speak more definitively since I don't consider myself an authority/expert in the field.
> But the language hasn't changed in any way. How can an attribute of the language change without any changes?
The reason I put in "for practical use" is that, pedantically speaking, no language actually requires GC; you "just" need to provision enough hardware (see HFT firms' (ab)use of Java: disabling the GC and resetting programs/machines at the end of the day). That's not relevant for basically anyone, though, since practically speaking you usually want to bound resource use, and some languages rely on a GC to do that.
I guess "general" or "normal" might have been a better word than "practical" in that case. I didn't intend to claim that how programmers use a language affects whether it should be considered a GC language or not.
Pure engineers deliver perfect and fast software sometime around the Black Hole Era. Not quite the heat death of the universe, but almost there.
Impure engineers deliver "working" code by the deadline, for an arbitrary definition of working. Basically, Worse Is Better™.