Rust has a lot of great qualities that C++ lacks, but comparing `rustc` to `gcc` or `clang` on move-semantics checking is just kind of silly these days. `rustc` has `clang-tidy` built in.
`clang-tidy` is not letting you mutate or even access that moved-from "suffix" object without throwing an error.
It's annoying that you need `clang-tidy` and ASAN and shit to get comparable runtime safety even in greenfield C++, but that's not why to prefer Rust.
Prefer Rust because of a better traits system and pattern matching and syntax for `Maybe/Either` and a consistent build/library story and a number of other things.
Your point is a fair one, but defaults matter. There's little reason for clang-tidy not to be part of the default clang invocation by now, other than an aversion to producing new output for existing projects (which you could argue are already "broken"). Unless clang-tidy has false positives, in which case the comparison isn't apples to apples.
Oh I completely agree that defaults matter, and I wouldn't advise anyone to go start some big C++ project unless they had a very good reason to.
But Rust vs. C++ is not an apples-to-apples comparison in this regard: Rust got a clean slate and was willing to cut ties with engineer-millennia of existing high-value software to do it.
In a perfect world Rust wouldn't be ambivalent at best and hostile at worst to C++ interop, c'est la vie.
Rust isn't hostile about C++ interop, it's that between native interop (which requires dealing with templates and memory unsafety) and safety, Rust prioritized safety, while Carbon is trying the alternate approach. The behavior of C++ is simply hard to interface with while providing the assurances Rust gives you.
Edit: how many languages have native C++ interop that supports the whole language? Would love to hear of any.
I said ambivalent at best, hostile at worst, and I'm sorry, it is. I write FFI to C++ in Rust, Python, and Haskell practically every week, and Rust <-> C++ sucks.
Slap it in an `unsafe` block, fine. But let me move a `std::vector<std::string>` into my `unsafe` block easily. Erase the types, fine. But let me call `v.at(idx)`.
Python takes C++ interop seriously, which is why Tensorflow and PyTorch and all the other people trying to script gigantic, extreme-value C++ codebases use it. Try `pybind11` sometime, it's night and day.
Edit to reply to edit: `pybind11` supports an absurd amount of C++ out of the box with completely natural semantics and a very modest performance penalty. So, Python.
I wouldn't say Python takes C++ interop seriously; it's more that the pybind11 people are amazing at what they do and found a way to slice the problem neatly. But yes, it's night and day, pybind11 is a godsend.
Haha I don't know if it matters much whether we praise the `pybind11` folks in particular or the Python community in general or both: they are fucking amazing at what they do. It's such a hard problem that most language communities don't even seriously try, it's a nightmare to get even close on, and they get more than close. Everything just works exactly as you'd expect, with great performance (relatively speaking of course) into the bargain.
Top ten best open source language efforts on Earth. Maybe top five. Just legendary stuff.
I do think that for all its faults though (cough packaging cough), the Python community displays an incredible commitment to getting shit done and helping people solve their problems. It's really a rather mediocre language as these scripty things go, but it just friggin works, which is why everyone uses it for everything.
Oh I meant C++ binding was more of a pain before. I've been doing swig and other kinds of C FFI wrapping for years and this is far better...
Of course it relies on the module/extension API from python that is quite amenable to this kind of full-object and their methods binding. But it works very well together.
So much so that we asked for something similar for Ada https://github.com/AdaCore/Ada-py-bind (Raphael Amiard from AdaCore is quite the Ada hacker), which, even with the limitations of Ada generics, has been a godsend for scripting Ada objects from Python.
I have to admit to basically total ignorance of Ada. It's got a heritage around avionics-type stuff, right? Settings with extremely low defect tolerance? Looks kinda like Pascal?
Is it something that a true polyglot hacker needs to know? Does it have important, novel ideas or unique applications that aren't addressed well by other tools? I love learning languages, but there are so many you'll never learn them all well.
It's my daily driver and I think one of the reasons my company can build very complex, high-integrity systems with little staff. It's got a lot of nifty features and is still being updated to integrate more. I've seen so many people with no particular interest in hacking or programming learn Ada and become so efficient with it so fast, with very few bugs, that there must be something there.
I'd say the driving principle is that the language tries to have you write code that can be read more easily, and is less concerned with helping you write millions of lines of code per month. One example (once you get past the knee-jerk 'whoa, it uses begin and end instead of braces') is generics, where instantiation is mostly explicit (and thus can be painful to write), which helps reading, a lot. I'd downvote to hell any change to that (let the IDE/langserv generate the code for you, but please, generics are complex already).
The other driving principle is finding defects as soon as possible. So the compiler is harsh, there are lots of static checks. But it also helps you find defects at runtime as soon as possible. It can insert validity checks everywhere, if you can pay the runtime cost (hint: it's very rare you can't).
It has strong typing, very interesting typing features, no implicit conversions, most features you'd want in a modern language (tasking, OOP, generics, exceptions - erm...) and some that are more 'original' (contracts, quantifiers, if- and case-expressions, function argument qualifiers in/out/in-out) and very nice union types (discriminated records).
Recently it has been extended to formal proof with the SPARK language, which is Ada with some language features disabled plus an automated proof environment (based on Why3 and SMT solvers). That's interesting new tech in its own right, with real advances these last years (proof of floating-point operations, memory ownership, ...).
I think I could go on and on. But if you can spare one or two hours I'd just go to https://learn.adacore.com/courses/intro-to-ada/index.html and run the tutorial (all in the browser) and read a bit about all the features, and just ask questions downthread, I'll try to monitor it (like I don't already monitor obsessively all my HN comments...).
I have. `cxx` and `autocxx` and `bindgen` and `cbindgen` are, better than nothing, I guess? But they're all flaky and have weird corner cases (and crash sometimes! I'm looking at you, `cbindgen`!) and don't handle containers well, if at all, and just, ugh.
I always end up saying fuck it and `extern "C"`-ing everything. It would be completely possible to make these tools work well, but the Rust ethos is "rewrite everything, pure Rust", at least in large parts of the community, and so these projects kind of never get totally dialed in.
> But they're all flakey and have weird corner cases and crash sometimes!
Did you report those crashes? There's also crubit these days, mostly combining the autocxx and (c)bindgen approaches. (It's a very new effort, which is why it's being kept separate from these more established ones.)
I know the talking points (it's impossible to go on the Internet and not know all the Rust talking points) but you have a few beers with some Rust people and pretty quickly you're hearing how anything that isn't Rust all the way down is somehow like, tainted, with the dreaded binary quality of being "unsafe" as opposed to "safe". A serious Rust hacker who shall remain nameless once broke out that old chestnut on me: "A barrel of wine and a spoonful of sewage makes a barrel of sewage." in reference to Rust software that links to C. It's not a coincidence that somewhere, every conceivable library that anyone could ever want is being rewritten in "pure Rust".
And it's a shame because Rust is fucking cool and I want to be using it even more than I already am. But I'm not going to become a Scientologist to get into a party in LA and I'm not going to throw away a mountain of excellent C and C++ because it's, unclean.
I'm optimistic that as Rust continues to have its center of gravity migrate away from strictly OSS and into ever higher stakes industrial settings (which it's clearly making great headway on) that the religious fervor will mellow and it'll become a "getting shit done" language that also happens to be a really cool language!
There's a difference between code that used to compile no longer compiling because of an incorrect lint, and code that was never accepted. Rust is restrictive and gets less so over time. C and C++ need to become more restrictive over time, but that's a more traumatic direction.
> Rust is restrictive and gets less so over time. C and C++ need to become more restrictive over time, but that's a more traumatic direction.
I don't agree at all with your comment, and I find that sort of opinion myopic and not grounded in real-world software projects.
I've worked on a fair share of legacy projects which were ported to the latest and greatest, including a couple of nightmare JavaScript ones. The very first thing we did was onboard onto static code analysis tools and source code formatters.
Once we enabled them we were faced with a big wall of red text dumps. With time that wall shrank until there was no more red, and from then on things stayed that way.
There was no trauma, only a kaizen approach to errors being flagged.
The whole C++ world already does this for decades, whether it's for compiler warnings, static code analysis, memory checkers, fuzzers, etc. What exactly leads you to believe this is traumatic?
What you're actually arguing seems to be "why I like Rust more than C++", not arguing why "clang-tidy has false positives, and thus the comparison isn't apples to apples then".
Clearly the positives can be just as false in Rust as in C++. Your actual objection is that anyone arguing that any feature of Clang can measure up to the corresponding feature of Rust at present is automatically disqualified from making that argument because... Clang's past "taints" its present? Like an original sin of sorts, but in programming? ("Apple" forbidden against comparison?)
I don't think it's about being tainted. It's that there's a bunch of C++ code out there already in Production in a ton of places that doesn't pass those checks, and may or may not actually be safe. It would be a ton of work that may not produce any end-user value for any C++ project to switch over to that.
Meanwhile, Rust has always had those checks, so there can't be any Rust code in Production that doesn't pass them that would be painful to switch over.
Sure, but again, that's an argument for why you dislike C++, not an argument for why false positives in that Clang-Tidy check somehow disqualify it from being compared to the corresponding checks in Rust, especially when they both have false positives.
I find it so weird that the Rust community is borderline evangelical about memory safety when a) it's not actually memory safe once you start doing heavy shit b) modern C++ is quite memory safe and c) there are so many other great reasons to like Rust.
Memory safety in serious systems software is something that you approach asymptotically and/or probabilistically. Rust makes it easier to be memory safe in a lot of scenarios, at the cost of the father-knows-best borrow checker, but a crashed program is a crashed program whether I dereferenced a null pointer or was poking around in a slice with multi-byte Unicode characters in it. And that's before you get to `rg unsafe` on your favorite industrial-strength Rust codebase.
Rust is cool for so many great reasons that get talked about so little because everyone seems too busy acting superior about memory safety. Talk about traits! Or Cargo! Or the cool async stuff! Anything but another lecture on memory safety.
> a crashed program is a crashed program whether I dereferenced a null pointer or was poking around in a slice with multi-byte Unicode characters in it
They aren't the same thing though, that's the point.
Dereferencing a NULL pointer isn't guaranteed to crash. In fact, if you are writing through the pointer, you may even have a security issue on your hands (RCE, etc.).
Safe Rust may have runtime errors that "crash" the program but this is a controlled, well-defined termination, and there is no way for the execution state itself to be corrupted like in C++.
Yep, my favorite part of heap/stack corruption is not when it crashes immediately but rather when it rears its head 2-3 weeks/months later when some upstream call pattern or timing has changed.
I've spent weeks chasing down single instances of this on multiple projects. The nasty part is you have no predictability in whether it's going to be one that crashes immediately, silently writes garbage (hopefully not to disk!), is a latent lurking crash, or a security vuln.
If you trash the stack then there's a good chance you lose the backtrace as well which can make a hard to debug issue become "find the needle in the haystack". I hope it's something that reproduces quickly and consistently because otherwise you're in for a ride.
Yeah, this just isn't how it is anymore. The last time I was up shit creek because 50k boxes were crash looping and GDB couldn't get me a stack trace was in 2014. The last time I spent more than 30 minutes chasing a memory corruption issue was in like, 2018. And it was because some wise ass had decided to roll his own fibers by stomping on `rip`, `rbp`, and `rsp`.
These days you use `std::unique_ptr`, build with clang-tidy, CI under ASAN, and it's never an issue in practice. Once in a blue moon the CI chirps an ASAN failure that gives you the entire history of the memory address with line numbers and you fix the typo.
The safety that Rust gives me is that its more expressive type system and modern affordances for things like exhaustive pattern matching let me avoid logic errors, which are every bit as deadly as buffer overruns and much harder to mechanically identify.
It is usually easier to write correct code in Rust than in C++ because it's much more modern and frankly kind of an everyman's Haskell (which I mean as a compliment). But it's intellectually dishonest to say that this doesn't come at a cost: when you wander out of the borrow checker's sweet spot it can become kind of a Tetris puzzle even when you know all the rules on paper.
The same pattern matching that lets people see a borrow checker puzzle and immediately say "right, we need to do X" is the pattern matching that lets a C++ hacker see a failed template instantiation and immediately know what got misspelled.
A sibling comment suggests this has more to do with where you work than how modern your C++ is, which rings true to me. Different kinds of programs need different kinds of memory management patterns, and some are more error-prone than others.
In my experience there also tends to be a long tail of memory corruption bugs. After flushing out those that are easy to run into or that have a major impact, everything seems fine and you can go years without really spending time on them, but they're still lurking at the edges of automated crash reports and mysterious bug reports you never quite manage to reproduce yourself. And when I do manage to track one down, it's as likely as not to be in, around, or even caused by modern C++ features.
Tetris puzzle or not, it's really quite nice to systematically rule out those kinds of issues. In some domains it may not be worth it, but in others they can hide major security issues or similar. And either way it sure beats periodically digging through crash dumps trying to piece together something that looks impossible from the surrounding source code.
If you need extreme robustness you have to have coverage and fuzzing and canaries and stuff for logic bugs as well as memory bugs. If you’ve got a long tail of non-exercised code paths, a “<” flipped with a “>” will fuck up your day just as bad as a use-after-free.
If your code is covered, ASAN will red-zone the memory bug. It checks every address.
People are welcome to their subjective opinions about the “easiest” way to get truly correct software (which almost no one needs), but the oft-repeated assertion/implication that the tools don’t exist to do it outside of Rust/Go is wrong. Not a subjective opinion, demonstrably incorrect.
And when enough truly important shit is written in Rust, which will be soon, there will be CVEs. Many of them.
Well, yeah, if you're reaching for that level of robustness you want every tool you can get. If you can get rid of a whole category of bugs with one tool, that only makes the other tools more effective for the rest!
(There are also cases where that extra robustness is more of a "nice to have," so if you can get a side effect of your approach to something more important, that changes the calculus too.)
Is every diff thoroughly reviewed? Is everything built with `-Wall -Wpedantic -Werror`, `clang-tidy` with most checks on, ASAN/TSAN/MSAN/UBSAN on every commit in the CI, and aggressively canaried against replay data (or whatever is appropriate to the domain to exercise all the paths)? Is all the code run through `clang-format` in a pre-commit hook to lower the cognitive overhead of spotting bugs?
I completely understand that when you turn all the checks up to maximum (which, in fairness, `rustc` does by default) you start with as many errors as you have files if you're lucky, and probably 10x that. I've had to take codebases from working by accident on every 10th line to passing every static analysis tool cheaper than PVS-Studio, and it's a bear, no doubt. But codebases that are `clang -Werror` clean, `clang-tidy` and `cppcheck` clean, ASAN/MSAN/UBSAN clean, and have all this enforced by CI?
I haven't seen those codebases thrash the core dump where GDB prints out a bunch of "????????????" instead of addresses with any frequency.
Someone should do a 2022-edition "Joel Test" (https://www.joelonsoftware.com/2000/08/09/the-joel-test-12-s...) because I think we're all using revision control now, times change, but until someone does, I'm happy to trade war stories about getting messy code bases / development workflows into fighting form.
That definitely ups the stakes on the "modern" approach a lot (like, in the limit case `#ifdef`-hell to get part way there).
There are firms that will sell you a suite of frontends supporting every C compiler back to the early 80s and integrate them into a modern toolchain, I used to work with an alum of such a firm and I gather it's great stuff. I also gather it costs whatever you can afford, so there's that. I forget the name of the company but I could ping my friend if that's interesting to you.
Worst case, you could not have to chase memory corruptions on the subset of your target platforms that LLVM targets.
> a) it's not actually memory safe once you start doing heavy shit b) modern C++ is quite memory safe
This just doesn't capture the problem that memory safety solves. A crashed program is not the worst-case scenario that it's trying to avoid. Even the most memory-safe language supports exiting early with an error message, or whatever.
In terms of language semantics, there is an all-or-nothing line between memory safety and undefined behavior. A memory safe program does what it says, locally, step-by-step, according to the semantics of the language. When a program exhibits UB, those guarantees are lost.
Of course, as you note, unsafe Rust also lets you violate memory safety, and in fact any memory safe language is at the mercy of its implementation and host. The reason people get evangelical about Rust's memory safety is one level higher: it offers a bridge back to memory safety, such that unsafe code stands on the same footing as the core language. When either are bug-free, the compiler can ensure they are used correctly, using the same type system features for both.
Modern C++ is certainly much less error-prone than the bad old days of manual `new` and `delete`, but it doesn't have an answer to this "unsafe encapsulation." To the contrary, modern C++ actually adds a bunch of new ways to violate memory safety by misusing library APIs. Iterator invalidation, use-after-move, string_view and span and borrowed ranges, by-reference lambda and coroutine captures, etc.
This all means that "serious systems software" in C++ has to approach memory safety via defensive copying or refcounting, copious use of sanitizers, and sandboxed sub-processes. Meanwhile, Rust programs can do things that would be unthinkable in a large C++ codebase, because the assumptions of both the language and unsafe code are encoded in the type system. (For example: https://manishearth.github.io/blog/2015/05/03/where-rust-rea...) It's a qualitatively different solution to the problem.
I think memory safety is the killer feature of Rust, and it has become so because people see the real-world problem it's solving, more than through evangelism. We'll see in a few years when more "heavy shit" has been written/rewritten in Rust. My prediction is that it will have significantly fewer memory safety issues than comparable C++ "heavy shit".
> Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law.

-- C.A.R. Hoare, in his Turing Award speech, 1981.
From where I sit the killer feature of Rust is that a bunch of amazingly cool software is written in it, especially in the terminal. I'm a big terminal guy, and I can't think off the top of my head of anything I use constantly that isn't written in Rust. `rg`, `fzf`, `zoxide`, `bat`, `viddy`, the list goes on and on, I fucking love the shit people are writing in Rust.
And I think that should be the killer feature of a language: that cool software is written in it and is continuing to be written in it. This is a killer feature shared by Rust and C++ and these days to be serious about performant software in diverse settings, you pretty much have to know both well.
Terminal programs are one area where Rust's strengths seem to align (very good CLI libraries/parsing, error management, and concurrency) and weaknesses are less relevant (async, GUIs), which might be why it seems to be gaining traction in that area.
Thanks for that list. I'd heard of rg and fzf but not the others.
I immediately thought: well what about Go for command line tools? Is this the viddy you speak of? https://github.com/sachaos/viddy If so, looks like it is written in Go. Looks like fzf too.
> We'll see in a few years when more "heavy shit" has been written/rewritten in rust.
I'd really like to see that. Would be cool to see a completely new Linux user space written in Rust. Not necessarily a rewrite of existing software, new ideas would be great. I tested Linux system calls and they worked very well even though they needed experimental inline assembly functionality to work. With system call support, anything is possible.
> a crashed program is a crashed program whether I dereferenced a null pointer or was poking around in a slice with multi-byte Unicode characters in it
Most of the biggest advances in software engineering are because of increased modularity. One of the best traditional ways to increase modularity is the ability to define and call functions. But any isolation between these "function" modules is only possible if you can at least factor out things into a function mechanically without introducing crashes (for example because of memory unsafety--modularity would fly out of the window right there).
> Rust is cool for so many great reasons that get talked about so little because everyone seems too busy acting superior about memory safety. Talk about traits! Or Cargo! Or the cool async stuff! Anything but another lecture on memory safety.
It's better not to dilute the message. All these other things are nice-to-have gimmicks. But the memory safety is a game-changer. It does no good to advertise 230 features at the same time. No one will remember. Advertise the killer feature. And that's the lifetime stuff, which gives you memory AND THREAD safety.
Rust's thread safety only applies to the special case of threads accessing in-process data.
Rust's type system can do very little to help when those threads are accessing the same record on a database without transactions, OS IPC on shared memory, manipulating files without locks, handling hardware signals, handling duplicate RPC calls,...
Yeah, "but that ultimately requires an unsafe block" is kind of true, except no one reads the code of every crate they depend on, and a direct dependency can look safe to its direct consumers while hiding unsafety further down.
This is a point that you constantly bring up in these threads, do you think most developers believe that data race safety should extend beyond the bounds of the process?
One thing that Rust’s type system does allow you to do is define a consistent manner in which to access external systems, even add types that will mimic the same safety. Is it perfect? Will it protect you from a different process working against the DB? Will it enforce things in the other process? No. But will it give you higher level semantics to be able to construct a better model for operating against that external system? Yes.
This has been a recurring theme from you, but in the cases you're describing the risk is only a race condition (no general solution is possible) and not a data race (which safe Rust is able to deny by design). These are categorically different problems.
People have to consider race conditions anyway, they're part of our world. For example if you use git's ordinary --force to overwrite certain changes that's subject to a TOCTOU race, which is why force-with-lease exists. Even in the real world, I once opened a bedroom window to throw a spider out onto the garden below and a different spider came in through the window in the brief interval while it was open - exploiting the "open window to throw out spider" race opportunity.
A data race isn't just "oh, it's just a race condition" or Rust wouldn't care: data races destroy sequential consistency. And humans need sequential consistency to reason about the behaviour of non-trivial programs, so without it, writing concurrent software is a nightmare. Hence, "fearless concurrency".
You won't destroy sequential consistency by having non-transactional SQL queries. Try it.
I have tried plenty of times, and seen not-so-happy train travelers with the same ticket for the same place on the same train, hence why I bring it up all the time.
It is obvious it is a subject that is irrelevant in the Rust community.
Who needs consistency in distributed systems when multiple threads from the same process are accessing the same external data without coordination.
Programmers do. Programmers are human and so can't reason about the behaviour of non-trivial programs without sequential consistency.
If I was trying to debug software which sometimes mistakenly issues people duplicate tickets, I think I'd want to be able to reason about how the software works, and that's not going to be possible if it doesn't even exhibit sequential consistency.
Er, no? Sequential consistency isn't some Rust invention; Leslie Lamport (yes, the LaTeX one) invented it for his 1979 paper "How to Make a Multiprocessor Computer That Correctly Executes Multiprocess Programs".
I rather like Lamport's last observation about what happens if you don't have sequential consistency and your programs instead just put up with whatever was cheap/efficient to implement (as will happen by default on a modern multi-core CPU): "verifying their correctness becomes a monumental task"
It was later proved that it's not merely "monumental": this is actually an undecidable problem in general, which explains why humans aren't good at it.
So, this is important in principle to get right, and (safe) Rust does so. You are of course welcome to decide you don't care, why aim to write correct programs anyway? And for now at least it seems in our industry many people agree.
While deadlock is undesirable, this is still behaviour which you can (and the author did) successfully reason about. You still have your consistent model of the world, in which you are deadlocked and can see exactly why.
"Fearless concurrency" isn't "It's impossible to make mistakes" it's only "The mistakes are no scarier than they were in your conventional non-concurrent program".
> Why do people say things like: "It's better not to dilute the message"?

> Better for who?
Better for everyone.
When talking about a new thing, it would be really silly to emphasize how nice the logo is, how nice the package it comes in is, look at the awesome tape the box is closed with etc. If I turn the product off it even turns off! Look at the nice rounded corners of the device!
It can even do async! Just like Javascript and .NET.
Who cares!
What is the main strength of the tool, the pain point it was made to eliminate? Lead with that.
> That's sales/marketing language, not engineering language.
Leading with the actual technical novelty that actually advances the state of the art in production compilers is marketing? Well, I guess it's good marketing in a way.
The user will find cargo on their own in 5 minutes.
I mean "message" how you want, HN is hosted in a free country.
But N=1 for you: as a serious polyglot user of Rust who knows it well and uses it all the time: this shit is a huge turnoff. It's a programming language. On a long enough timeline all the motivated hackers will end up knowing many programming languages well, they all have pros and cons.
Trying to boil important engineering decisions down to a tweet so that we can stay "on message" comes off like something someone would do if they were selling books or training or consulting services attached to a technology, which a priori gives them an agenda other than giving good advice.
So to keep it short: help people pick the right tool for the job without an agenda.
I think you have a point. Rust is primarily focused on being a systems language, and memory safety is the killer feature it brings to the table in that domain. But we know that Rust is being used in areas where its qualities as a systems language are less important.
Why, for example, would a Python developer pick up Rust? Probably because of the really strict typing addressing a major pain point for most Python developers, and the trait system being somewhat analogous to Protocols, which any Python developer who has chafed at the dynamic typing is almost certainly already familiar with. With good library support for interfacing between the two, it's a more natural pairing than most people would think on the face of it.
That said, while I don't think a Python developer reaches for Rust because of memory safety, I do think it's still an important factor, as it provides the guard rails that let someone who has primarily used a GC language, and not had to concern themselves as much with managing memory, start using Rust knowing that the compiler is not going to let them accidentally shoot themselves in the foot when it comes to memory management.
Rust is a programming language. It’s the right tool for the job of programming. It happens to have a lot of features that make it a very good programming language.
I think you’re wrong about the language and community, though. Its killer feature is its safety, be it memory, data race, or type. These are the reasons I was interested in learning the language. The fact that the tools make that easier is why I was able to struggle through the new concepts and actually be able to build useful things with it.
If the fact that people enjoy something as a general community turns you off, that’s not the community’s problem.
rustc rejects the resulting program if you mechanically factor a method taking &mut self into a helper that holds mutable borrows to half the fields while calling another function that accesses the other half (or vice versa: the caller holding &field calling a method that mutates other fields). Working around this requires the more complex transformation of passing individual fields into the subfunction (more work, but sometimes easier to read), or waiting for Rust to add partial borrows. Note I've never actually hit this case myself, though I've heard it's an issue people run into.
Rust is "cool", lately. But it lacks basic features that enable capturing important semantics in libraries.
So the tradeoff is not relative safety against a little compile-time inconvenience. The tradeoff is against "no, you cannot code that thing at all, suck it".
So almost all discussion of relative safety (which Rust advocates would like us to think is absolute) carefully sidesteps the point that there is a very great deal that cannot be expressed in Rust at all -- and not because expressing those things would have been at all unsafe.
This sort of comment is far from fair and misses the whole point.
The thesis of this article is how programming languages compare with regards to safety.
It makes no sense at all to compare particular implementations and try to pass personal assertions on particular features of said implementations as broad assertions about the programming languages they support.
So a particular Rust implementation is bundled with a linter. That's pretty convenient.
It just so happens that the linter is provided by a compiler stack that is also one of the main reference implementations of C++, and said linter also supports C++.
Is it fair, then, to claim Rust is somehow superior to C++ just because a particular implementation enables its linter by default on Rust code but not on C++?
> There's little reason for clang-tidy not to be part of the default clang invocation by now
It's perfectly fine that anyone forms their own personal opinion on what defaults a tool should ship with.
That says nothing about the programming languages or their safety, though.
As is clear from my other comments I also find this stuff unfair.
I try to keep in mind that on balance it’s a good thing that performant programs have become dramatically more accessible recently, and that most of the C/C++/Fortran/Haskell antagonism is a result of enthusiasm around that. For me it was BASIC -> C, but I imagine JS -> Rust is every bit as exhilarating.
But I do hope at least a few people read your comment and are inspired to learn a little background. Rust is cool because it remixed the broadly-accessible FP algebra, a kickass C++ toolchain decades in the making, and a big bet on linear typing.
I didn’t set out to be the jaded, un-hip old guy but here we are :) It’s a nifty new LLVM front end with type classes and the Either monad. Neat. Get off my lawn ;)
They do, that is why any C++ shop where code quality is relevant has a DevOps team that cares about the right defaults being enforced on the CI/CD pipeline.
Devs that don't care only get to build on their own computers.
It isn't foolproof, but it definitely helps to tame some cowboys.
To say that Rust has a built-in linter is wrong. A Rust program that does not build because of a memory-ownership error on the part of the programmer isn't rejected by the compiler due to a detected pattern; it is rejected because the program is "unsolvable" and cannot be built. I think a lot of people miss how integral the memory semantics Rust enforces are to how it parses and compiles programs.

If these things were simply lints, you could reason that a program could be built without following these semantics — that you could just go into the compiler's source and turn them off or remove them. You can't. The way Rust's compiler tracks memory is fundamental to how it compiles the binary. It is not simply pattern-matching code or the AST. Rust's compiler is actually tracking the lifecycle of every bit of memory allocated so it knows when to free it, and it does this at compile time, without running the program. These memory-semantics errors exist because they are integral. Turning them off would simply result in a broken compiler, or a program that never frees memory, because the compile-time ownership analysis Rust relies on becomes impossible.
Objectively speaking, Rust does not "have a lot of great qualities that C++ lacks". Any new feature or improvement has its pluses, but also its minuses.
* Rust has traits, but does not support OOP. Architectures where OOP is particularly effective are proving to be a significant challenge for Rust - GUIs are the obvious one, but game development projects in Rust have also had to invent new approaches.
* Option/Result make the code flow obvious, but having to return them in nearly all function calls is tedious. Rust has had several attempts at alleviating this problem, but still hasn't matched the convenience of exceptions.
* match is IMO syntactic sugar and its benefits for correctness are being oversold. The fact that one is basically obliged to use it leads to code that's sometimes too deep. Exceptions would cut through this error-handling noise, if they were available.
* The build story is convenient, but this had the unintended effect of encouraging dependency explosion. Adding typical crates results in dozens of transitive dependencies being included in a project.
The notable improvement that Rust brings to the table is machine-verified memory safety with C++-like performance, but that of course comes at the cost of having to adapt code to what the borrow checker understands. If one needs the feature, then the cost is worth paying, if not, Rust is more of a personal choice than an inevitable conclusion.
And finally, perhaps Rust's biggest sin is that it's big, it's complex and there's no end in sight to the complexity spiral, just like for C++. I can only imagine the chagrin of the Rust community as they see themselves competing with Go for many projects where performance is not absolutely critical and often being second choice exactly because of this complexity.
More to the point, Rust lacks key features I need to capture essential semantics into libraries. So, the libraries I could write in Rust would be less powerful than libraries I can write in C++.
Among common uses for these more powerful features is to make misuse of the library into a compile-time error. Coding the library in Rust, if possible at all, would mean failing to prevent these usage errors.
The point here is that C++ puts more power in the hands of a library writer, and both the responsibility and capability to enforce safety in uses of the library. Rust jealously reserves maintaining safety to the compiler alone.
I think you meant thanks to a sane(r) macro system? Both Rust and C++ use monomorphisation for generics; I believe the shitty compiler errors are due to C++'s templating.
In what way can you have ‘generics’ in C++ that are not based on templating? I am almost certain that templates are inherently involved in any implementation of anything ‘generic’. Maybe I’m wrong about what you mean by generics though.
I feel like a better comparison would be to use std::span (C++20) to mimic Rust's slice. Otherwise you might be tricked into thinking that adding
// hey don't pass in a temporary
auto make_appender(std::vector<int>&& suffix) = delete;
(which turns the provided code into a compiler error under both g++ and clang++) is adequate to prevent the immediate class of issues (namely, C++ allows const Foo& and Foo&& to bind to temporaries).
Meanwhile, it's really really easy to make std::span dangle:
> Nobody would write a make_appender that takes a span argument, because it makes no sense.
I don't agree with that. If you can guarantee that the data pointed to by the span will outlive your appender, then it's safe. And if you don't actually want to transfer ownership or incur the overhead of a copy, and you don't care if your input is a vector or an array, then it's the correct abstraction.
Replace std::span with std::weak_ptr (or a raw pointer), and replace the closure with a class (e.g. a tree where each node has a weak pointer to its parent), and tell me again that nobody would ever write that code. It's fundamentally the same concept: if your ownership model isn't ironclad, or if any of your assumptions are ever violated, then you can run into use-after-free.
"final" in Java is kind of useless, really. A const mechanism like C++ has would go a long way and would be a perfect fit for the OO nature of the language.
Final in Java means the value of a variable, property or parameter will not change after its initial assignment. Values in Java can be either a reference to an object or a primitive such as an integer, double or bool. It's definitely far from useless as it asserts that the reference or value you capture in a closure is not an old version that has been replaced, this approach is a consequence of Java disallowing arbitrary pointers. IMO Java's biggest mistake here is mutability by default, which Kotlin has learned from. If you understand why it is this way it makes a lot of sense and tbh I think it promotes better code. That said I would like to see more immutability in Java and with things like Record classes you can see Java is moving in the right direction.
I agree with that and I use final as much as possible. E.g. instance variables that don't need to change, I almost religiously declare them as "final" and initialize them in the constructor.
What I mean is that what Java really "should" have is const like C++. A C++ function with a prototype of:
int doSomething(std::vector<int> const &x)
tells me much more than the equivalent Java:
int doSomething(final List<Integer> x)
Also C++ member functions can declare themselves as not modifying their "this" instance. E.g. there's no way to write this code in Java:
int X::doSomething(std::vector<int> const &x) const {
...
}
which is an extremely powerful, compile-time checkable description of what we are doing.
It's not that final is useless (probably wrong choice of words there), what it does is okay and it's correct to use it as much as possible, but it's a far cry from the static guarantees afforded by const-correct code.
I also agree that the correct approach is immutability by default, but that ship has sailed, and it's also an orthogonal concern to what I'm saying here.
The author could compile the C++ with sanitizers, e.g. -fsanitize=address,undefined
and make a make_appender function that leverages perfect forwarding...:
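One sketch of what that forwarding version might look like (my reconstruction, not the commenter's code; assumes C++14 init-captures):

```cpp
#include <utility>
#include <vector>

// The closure takes ownership of (or copies) the suffix via an
// init-capture, so it cannot dangle whether the caller passes an
// lvalue (copied) or a temporary (moved).
template <typename Vec>
auto make_appender(Vec&& suffix) {
    return [s = std::forward<Vec>(suffix)](std::vector<int>& v) {
        v.insert(v.end(), s.begin(), s.end());
    };
}
```

With this definition, `make_appender({3, 4})` is safe: the temporary is moved into the closure instead of being captured by reference.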
There's a solid argument in here but it feels like there must be a better example. Can we think of a function that does something worth doing, in a way that programmers of all these languages would actually use, and that sets a subtle trap for C++ programmers? When I read this article all I see is a useless function that contains a completely obvious trap which, yes, Rust prevents, but also just thinking at all would have prevented.
Another small thing: the C++ in this article looks weird to C++ programmers because it qualified vector with std, but does not qualify move.
Why is this such a common defense for the peculiarities of C++? I see it pop up at least a couple of times, anytime C++ is criticized.
I don't have anything against taking pride in one's skill and craftsmanship, but excusing a tool's failings purely on the basis that one needs more skill to wield it and avoid those failings? I want to have that same level of skill and have my tool multiply my skill's output to the max, not have my skill wasted coaxing my tool to perform correctly.
If the implication is that the tool requiring more skill gives commensurate benefits, fair enough, but not if it's just hand-waving away obvious downsides.
Because the code in question isn't something a C++ dev would write; instead it looks like something someone who doesn't use the language much, if at all, decided to use for this pretty silly comparison.
It would be one thing if this required skill, but this example is downright silly. Some other people here have posted more reasonable versions that might actually occur in the real world (like the example with std::span).
Yeah, agreed that a competent C++ dev would not write the code in the example. The charitable interpretation, though, is that errors of the same type can crop up in real codebases; the example is just simplified for the purposes of discussion.
In such examples you’re always primed to look for a mistake. It’s a different situation when you have 100K+ lines of code, a dozen developers, and a deadline.
It’s a difference between “look this is Wally” (duh, of course) and a game of “Where’s Wally?”.
Whenever a CVE is discussed people say that the error is obvious, and no good programmer would write like that. It’s easy in hindsight.
Yeah, and C++ static analysis tools already warn for this case, so even for new C++ programmers, for whom it might not be entirely obvious, it's still easy to catch the error.
The bad C++ code is in the very first line of the "make_appender" definition: capturing the closure's environment by reference is nonsense: It is equivalent to returning a reference to an argument. It is not, then, a closure at all.
Using a correctly-defined make_appender would not, then, produce undefined behavior when you use it, with or without "move".
What the author has done here was to take a too-obviously wrong operation, returning a reference to an argument, but dress it up with syntax that will look less familiar to some readers, and pass it off as insightful.
But using a wrong function and getting wrong results is not surprising.
Piling on more uses after, that give more wrong results, does not reveal anything more.
When you need disingenuous arguments to make your point, it tells us more about your point than about the thing you are trying to make a point about. And, publishing anyway tells us more about you.
Returning that fake closure should evoke a compiler warning, if you turn on warnings.
>The bad C++ code is in the very first line of the "make_appender" definition: capturing the closure's environment by reference is nonsense: It is equivalent to returning a reference to an argument.
If it is so bad, it should (in the sense of how things would be in an ideal world) not compile.
>It is not, then, a closure at all.
It is a closure because all variables are closed-over and there are no free variables in the lambda body anymore. That is the definition of closure.
>But using a wrong function and getting wrong results is not surprising.
In an ideal world, there should be a compilation error. (There is in Rust)
The majority of what's wrong in C++ is that it lets you do nonsensical (even dangerous) things, most of the time without even a warning (and not because it's technically impossible to warn--it just didn't occur to them). It's okay to acknowledge that--it's a product of its time.
>Returning that fake closure should evoke a compiler warning, if you turn on warnings.
That "should" tells me all I need to know. In the end either safety is important, or it isn't. Choose accordingly.
> If it is so bad, it should (in the sense of how things would be in an ideal world) not compile.
Yes, C++ can be a bad solution to a lot of problems, and that's okay. Use rust if you need a machine guarantee for memory safety (or you just like the language), you can use Go if you just don't care about that at all and want the language to take care of it. But you can use C++ for non-critical software that needs to be fast (games come to mind). Rust can be too much of a mental overhead than it's worth for some.
That closure is just not a good example. Nobody would write this, because when writing C++ code you _do_ think about whether you want a reference or not. Sure, a lot of bugs can happen but this is not, in my opinion, one of them.
If you want to prove a point, prove it fairly.
Note: I don't use C++ anymore, and I don't like it very much for other reasons.
> The compiler literally takes all the mental overhead away.
Increasing the strictness of the language has one effect: decreasing the number of potential solutions for a given problem.
If the restrictions are carefully chosen (like they are in rust) this leads to generally safer code. But don't fool yourself, the restrictions don't merely generate new solutions - they reject the ones that don't pass the tests.
A more extreme example is formal theorem provers. Carefully constructed proofs will take a lot of effort, but it will also make you confident that the code does what it needs to do.
The rust borrow checker is just a more restricted theorem prover that doesn't touch the logic, it just deals with the memory side of things. It's indeed very helpful in trying to explain what's wrong (and even suggest fixes), but it doesn't take the programmer overhead away.
In a more complex system rust inevitably makes it harder to come up with solutions. It will reject valid code just because it can't prove it's right, not because it's actually wrong. As a programmer, you're going to come up with such solutions, and while in time you get more used to writing code that rust likes, and rust too gets better at accepting correct solutions, you're going to have to fight the borrow checker sometimes.
I don't have a ton of experience with rust, but I encountered cases where equivalent C++ code would've worked just fine, but I had to change it because rust didn't like it.
Rust is an amazing language, but it definitely doesn't 'take all the mental overhead away'.
I don’t think so. In this analogy you don’t go to the doctor at all, because you feel nothing until a bone cracks. The whole point is based on “but nobody would do that”, so in reality we should see no results of UB and segfaults outside of educational experiments. But the only way to do that is to keep our eyes closed.
If you don't need that at the first slot, then the biggest strength of Rust doesn't apply to your problem (and you would waste time having to track lifetime parameters more than necessary to solve your problem).
There's also another item in that list, and that is performance. As soon as you slightly relax that one, Rust does become massively easier. A well placed .clone() or Arc makes the rest of the code performant enough and easier to understand, which makes the equation of choosing Rust over other alternatives in spaces that aren't necessarily systems programming less problematic.
Big problem in these discussions is what should be prioritized.
To me, with the 20 years of experience I have in 8 programming languages, I'll always include safety. Especially having in mind that it's nearly zero-cost in terms of runtime performance.
So to me not choosing safety is a strong sign that I don't want to work with the people who practice that.
The point missed everywhere is that Rust lacks key language features needed to capture essential semantics in libraries, that C++ provides. To code libraries I want to code, I cannot use Rust. Rust cannot express them.
So, safety, good, fast enough, good, but insufficiently expressive? No, thank you.
can you elaborate on those key missing language features ? You have commented multiple times about that, but haven't seen you giving any concrete example. I'm Genuinely curious.
Where to begin? Operator overloading. Programmed move semantics. Generics argument concepts. Look at the whole list of features C++ got since C++11, and subtract out the few of those Rust had or got.
I propose you start with genuinely concerning problems. Operator overloading is a first-world problem and I've made a good career never depending on it (outside of my active C/C++ years at the start of it).
"Generics argument concepts" says exactly nothing, to me at least. Elaborate?
"Programmed move semantics" is, if I understand you correctly, a flavor / taste thing but I can agree it can be made better and more explicit -- say by not using the `=` operator for it. That I could stand behind. Still, it's only catching you off-guard while you're learning. 50/50 though. It's a concern but IMO not a major one.
And your final sentence betrays bias to C++. Well fine, use that, nobody is forcing you to work with Rust, right?
But if you're willing to bash Rust, please do so with concrete arguments. If you know something negative about it that I don't, I believe I and many others will benefit from informed objective criticism.
Do you have that? Already asked you in another comment and I'm still willing to listen to it.
std::vector<int> suffix{3, 4};
auto append34 = make_appender(suffix); // Version 1: test will work
auto append34 = make_appender({3, 4}); // Version 2: test will fail
I was in a mood to type the examples but I accidentally made Version 1 because I had started with the author's first example. I wondered where the problem was and couldn't find any (neither running nor reading the code) until I noticed the author had changed to Version 2.
The problem is "obvious" in hindsight but one of the problems with C++ is that it makes certain things too implicit/convenient. I get that references are a staple feature in C++ but it's not at all obvious from the calling location that Version 2 is a bug. To see that, you have to navigate to the callee (finding which might be hard enough alone without a very solid IDE), and make sure that it's not capturing the argument by reference or not doing anything unsafe there.
It's often a problem when a seemingly "value" argument at the call site is actually treated as a reference by the callee. There are other languages that have this too, and I've never liked it.
As someone who doesn't regularly code in C++ but has a solid understanding of the basics, I wonder why C++ ever allowed a reference parameter to bind to a temporary. To me it feels like "references have value syntax but pointer semantics, BUT you should program like they had value semantics"? Which to me is exactly the kind of premature optimization that is looking for trouble.
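To make the rule being questioned concrete (my own sketch, not from the article): a const& parameter binds to a temporary, and the temporary lives only until the end of the full expression containing the call — fine if the callee only reads it, dangerous if the callee stashes the reference.

```cpp
#include <cstddef>
#include <string>

std::size_t len(const std::string& s) { return s.size(); }  // safe: reads only

const std::string* stash(const std::string& s) { return &s; }  // unsafe to keep

std::size_t demo() {
    std::size_t n = len(std::string("temporary"));  // OK: used within the call
    // const std::string* p = stash(std::string("oops"));  // p dangles here
    return n;
}
```

Both calls compile without warnings by default; only the second one leaves a dangling pointer behind.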
Again, this isn't a right/wrong thing. Rust moves by default (and a lot of people find "=" a weird pun for that), C++ copies by default and has rvalue-references and an explicit `std::move`.
If you want a copy in your C++ lambda, you start it `[=](...) {...}`; if you want to capture by reference in your C++ lambda, you start it `[&](...) {...}`; if you want something trickier you do trickier stuff.
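The two capture defaults can be sketched like this (my illustration): `[=]` copies at creation time, `[&]` observes the live variable.

```cpp
#include <cassert>

int capture_demo() {
    int x = 1;
    auto by_value = [=] { return x; };  // x copied now
    auto by_ref   = [&] { return x; };  // x referenced
    x = 2;
    assert(by_value() == 1);  // sees the old copy
    assert(by_ref() == 2);    // sees the current value
    return by_value() + by_ref();
}
```

The article's bug is exactly a `[&]`-style capture outliving the variable it refers to.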
Rust opts you in to the nitpicky static analyzer and you have to opt out with `unsafe`, in C++ you have to opt in with e.g. `clang-tidy` or some annotations.
They are remarkably similar, just with different defaults.
> They are remarkably similar, just with different defaults.
I'm going to lead with this, because I think it's most important: Culturally there's a world of difference. Safety is a part of Rust's culture. "Culture eats strategy for breakfast".
Take sorting. In C++ the default sort is unstable, while in Rust the default sort is stable, that's just those defaults you mentioned (each has both kinds), although the choice speaks to culture. But look closer, in C++ the sort has undefined behaviour if your type isn't totally ordered. In Rust you can't sort the partially ordered types without saying how to order them fully. Still, in both languages we can write a custom order, so what happens then? In C++ if your custom order is nonsense you get... undefined behaviour. In Rust sorting won't necessarily work with a nonsense custom order but the behaviour remains well defined.
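The comparator contract mentioned above can be shown in a few lines (my sketch): std::sort requires a strict weak ordering, `a < b` satisfies it, `a <= b` does not, and passing the latter is undefined behavior in C++ (often an out-of-bounds read or a hang), whereas Rust's sort_by stays memory-safe even with a nonsense Ordering.

```cpp
#include <algorithm>
#include <vector>

std::vector<int> sorted_ok(std::vector<int> v) {
    // OK: operator< is a strict weak ordering.
    std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
    // UB -- do not run: <= is not a strict weak ordering.
    // std::sort(v.begin(), v.end(), [](int a, int b) { return a <= b; });
    return v;
}
```

The compiler accepts both comparators without complaint; only the contract in the standard distinguishes them.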
> Rust opts you in to the nitpicky static analyzer and you have to opt out with `unsafe`
Unsafe gives you a small number of dangerous "super powers" needed to write efficient low-level code, it does not opt out of the borrow checker's analysis, or indeed most other checks. This misconception makes me wonder how much of what you've written is conjecture rather than practical experience.
The sibling says that C++ has the wrong defaults, full stop.
Well in Rust the default `HashMap` uses a cryptographic hash, and you see it everywhere, it's the de facto community "default". In C++ the community "default" is `absl::flat_hash_map`/`folly::F14`, which use SIMD to compare a whole stripe of key-prefixes simultaneously.
I want different defaults for different programs, but the idea that it's esoteric to ever want an associative container within arms reach that fucking demolishes the other one is, ugh, God I want to like Rust even more than I do but this "we're right and everyone else hasn't seen the light" routine is infuriating and pushes me at least away.
My parent comment is trying to emphasize that this isn't a right/wrong thing, different tools for different jobs. And people are just like: "nope, everything but Rust is wrong".
I like and use both C++ and Rust. I also have plenty of bones to pick with both languages.
However, I’ve never gotten the sense that Rust itself promotes the idea that “everything except for Rust is wrong.” I also don’t read much on the Internet these days, and by not doing so I probably avoid much of the hype that people are pushing about Rust.
Since it has been established as Hot New Thing, there are huge social incentives tied up in promoting it.
Why do you think that a language with better details would prevent you from opting in to whatever non-default behavior you need?
Having the "right" defaults is better for everyone. Folks who don't know or care get a good, safe default with no undefined behavior or unexpected danger, and folks who know better can opt into something that fits their needs explicitly.
Hashbrown (the Swiss Tables implementation) replaced the previous HashMap implementation in July 2019.
The port is a little older, 2018. The idea was famously explained at CppCon 2017, I don't know whether Google had published on Swiss Tables before that year.
The (default) hash for Rust's HashMap and HashSet is a SipHash. People shouldn't call this a "cryptographic hash" or a "crypto hash" - that's misleading as it would lead you to think of algorithms like the SHA-2 family - but this is literally a cryptographic algorithm just one with very specific properties suitable for this task.
Such algorithms are crucial to avoid being subject to a Denial of Service attack which is, in fact, a security problem. Of course under the C++ "blame the programmer" philosophy you don't deserve protection from Denial of Service unless you knew you needed that and figured out how to ask for it properly.
Just as with the sort functions this is about safe defaults, not about constraining people who know what they're doing. Dropping in FNV instead of SipHash, or even using the identity function as a "hash", is not difficult if you are sure that's what you need.
I clearly spoke out of turn when I mouthed off about Rust's table not defaulting to a Swiss design, and I thank you for straightening me out.
But to the degree that C++ has an RTFM vibe, and I really don't think you'll hear Andrei or Meyers or Sutter talking that way much, it's uniformly applied and not particularly partisan. In my experience C++ pros would rather be writing Haskell and that's where you get all these over-templated "header only" libraries.
Rust is in a glass house on "blame the programmer" stuff, because it's "blame the programmer for being dumb enough to not be using Rust exclusively".
The memory safety catechism makes sense for my SSH client, or web browser, or web server, or shell, or another few dozen security critical things. And frankly I'd feel safer if someone did a ground-up reimplementation of `bash` in Rust, I'd use it in a heartbeat. Tailscale writes their shit in Go for a reason, you don't want even the possibility of a use-after-free in your VPN.
But most of the software I run? It runs as me, and if someone is running as me behind the firewall, I'm in deep shit already.
`rustup` has `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh` on the fucking home page, ditto vim/emacs extensions, etc. I'm calling bullshit on the every damned thing needs to be DoS or timing-attack hardened. It's marketing, and my original point is that all the other great modern features of Rust make a much better list of talking points.
> I'm calling bullshit on the every damned thing needs to be DoS or timing-attack hardened
But that isn't the claim. Rust's defaults are safe. Remember Rust's one line description "A language empowering everyone to build reliable and efficient software".
This is like with the decision that Rust's sort() is a stable sort. I know what a stable sort is, and so do you, so if we care we may decide it's appropriate to use the unstable sort which could be faster. But programmers who don't know what a stable sort is aren't expected to learn about it before their sort does what they expected.
Same here, I know that SipHash is slower than Fowler–Noll–Vo, which in turn is slower than the identity function, and I know why it would or would not be OK to choose them, and presumably you do too. So if we care we may choose a different hasher for our HashMap. But programmers who don't know about hash algorithms aren't expected to go learn all this stuff before using HashMap.
I think maybe C++ isn't programming it's actually a live action "Um, actually" game where the stakes are your program arbitrarily misbehaves unless you correctly guessed all the things wrong with whatever code you just wrote despite the compiler insisting there's nothing wrong with it as written.
Could I do OK at that game? I'd like to think so. Do I want to play? No thanks.
const doesn't make a difference in this case. It's about the passed-by-reference object being destroyed after the function returns. That is because the object was passed as a temporary.
The fact that you may know that the make_appender definition is bad does not mean every C++ user knows it as well. It's also impossible for you to know ALL the possible bad, UB-leading C++ code out there.
I think the point the author tries to make is that, while C++ and Rust are probably "the same" for the most skillful and disciplined programmers (such as you), for the average human, Rust just catches way more of their errors early. An extreme analogy would be trying to climb Everest all by yourself vs. with a professional guide.
Not to disagree necessarily, but if you (give me a little rope) bucket languages that are hard to get past the compiler but more often correct when you do (Rust, Haskell), and languages where it's pretty easy to get something past the compiler and tweak it until it works well enough for your purposes (JS, C/C++), the tweak-it-until-it-kinda-works languages are fucking killing it on adoption.
I dunno man. I've done C, C++, JavaScript and TypeScript professionally for significant chunks of my career, and the trend that I've observed has overwhelmingly been towards stricter compilers. For example in the front-end world, TypeScript has absolutely exploded in adoption. Everyone could be still using JavaScript, but companies from startups to huge corporates have explicitly decided they want compile type safety -- often in "only" front-end code.
I suspect that the adoption of Rust as a system language is going slower because that's just the natural pace of embedded/systems development, not because of anything intrinsic to Rust. There are now 30k+ lines of Rust code [1] in the Linux kernel. C++ can't claim that.
Oh I think we probably see largely eye-to-eye. My weapon of choice when there are no other constraints is Haskell, and one reason I really like Rust is that I can get a lot of the Haskell features I like in a highly-performant setting. Most of the C++ I maintain these days is in whole or in part generated by Haskell. And if I have to write something fast by hand and it doesn't need to link against existing code, I generally reach for Rust.
I was also unaware that Rust has such significant penetration in the Linux kernel, and that's a place where I can see it really shining.
My first comment in this thread was something to the effect that Rust has tons of great stuff to offer, and that the memory safety argument is actually weaker than people think and probably not the only thing people should talk about.
The resulting gang-tackle is just one more data point that the community is still too small and evangelical for me to want to get involved past my proprietary Rust stuff.
Ha, the Rust community is very energetic, but IMHO they largely put that energy to good use!
I’m using it for a new project and honestly I’m using it more for the modern tooling and easy C interop than the safety features, but I’m a fan overall. Think it’s a really good language.
> Most of the C++ I maintain these days is in whole or in part generated by Haskell
Can you expand on that? I'm currently researching something similar but lower level & lisp instead of Haskell. It would help to see some existing examples to figure out if it's worth it or not.
Ideally we'll just be able to open source it soon!
Basically we have a nice Haskell DSL for generating arbitrary C++, and we deal with lots of code you wouldn't want to write by hand (big nested switch statements and other kinds of state-machine logic, choices about loop unrolling, lots of template overloads, SIMD intrinsics that require immediate values, etc. etc.) so we write Haskell that generates C++ and feeds it to e.g. `clang`.
Some of this is directly in Haskell, and some of it is little compilers mostly done using Megaparsec. It's a really nice approach where it fits!
> it's pretty easy to get something past the compiler and tweak it until it works
That's just as true of Rust if you use clone(), Rc<> and RefCell<>. You just have to familiarize yourself with a few boilerplate patterns, and the best part is you're only trading off a modicum of performance while preserving safety. So Rust can work quite well as a language for exploratory programming.
> the tweak-it-until-it-kinda-works languages are fucking killing it on adoption.
Industry clearly prioritises speed of delivery above all else. Security, reliability and maintenance are future problems (that would be nice to have).
However, in order to gain the power to reason about our code and be able to prove the correctness of various properties (what a typechecker does), we have to program with more restrictive models. This has been argued many times before (e.g. Dijkstra's structured programming). Haskell and Rust are just two of many examples. My favourite is regular expressions, choose actual proper regexes and you have guaranteed O(n) execution, choose Perl/Python "Regex" and you have a potential security hole.
JS and C++ are killing it on adoption because they had near monopolies for an extended period of time in their respective areas (browser, native higher level language). They are popular despite their obvious (some in hindsight) shortcomings due to lack of alternatives in the same categories.
You can add `[[clang::lifetimebound]]` on the suffix parameter and you get:

```
<source>:21:35: warning: temporary whose address is used as value of local variable 'append34' will be destroyed at the end of the full-expression [-Wdangling]
    auto append34 = make_appender({3, 4});
```
A function returning a value that depends on the lifetime of the function's parameter is not crazy at all. Every class getter method that returns a reference to a member of the class does this.
Sure. But returning the address of a stack-allocated object is (usually) broken.
There isn't a right and wrong here: sometimes you want to opt in to the check for that, sometimes you want to opt out of it.
Sometimes I actually do want to fuck with addresses on the stack in weird, potentially architecture-dependent ways, it's rare but it happens.
I happen to think that Rust's linear/affine typing is by far the most usable low/zero-cost memory management model that anyone has demonstrated at scale and a real achievement in practical computer science, but it comes at a pretty serious cost in `Box`-this and `Arc`-that and `Rc`-other-thing and generally the borrow-checker being a PITA about some stuff we're used to doing.
Rust is very cool and I use it, but the "using C/C++ is fucking strangers without protection"-vibe got old years ago.
I was trying to be a little more diplomatic in my sibling reply because I've locked horns with the Rust community before and not enjoyed it, but you're not wrong.
It's possible the author actually made this mistake, and because C++ is C++ they didn't realize why or how. Not a lot of programmers have a good handle on C++, maybe even most don't... Hence these other languages.
If the author deliberately chose to compile with warnings turned off, in order to present an example that would crash, then that tells us more about the author than about the point.
Totally not the point of the article, and totally subjective I know, but to me the thing that jumps out is how much more readable the Go code is than the others.
This is the main reason that I use Go as my main language, and why many orgs are starting to adopt it: it's easy to read and jump into. I would argue, however, that it's very easy to create antipatterns and just general spaghetti code with Go. A language that's easy to be productive with != one that's also easy to maintain. Design and philosophy becomes very important with large codebases in the language.
source: consultant, seen some truly heinous Go monoliths.
There are definitely some ways to do bad designs in Go, but I have the feeling it will be more immediately apparent what's wrong (or at least what part of the system needs rework). The reason being that there's no way to obfuscate an awful design by wrapping it in mountains of generic programming and language sugar, making the whole thing a lot worse.
It's only my gut feeling, but does that match your experience?
Definitely subjective. I don't find some parts very readable. E.g., this line took me a minute to parse:

    append34 := func() func([]int) []int {
If I was going to rank the readability I would say:
1. Rust function bodies
2. Go code
3. Rust function signatures
4. C++ code
Which you could argue is me shifting the boundaries a bit, but sufficiently statically typed languages seem to develop two (or more) sublanguages. Global complexity definitely pushes Rust down a peg.
I agree. However, my gut reaction was that the style of the go code was written differently than what I’d expect if asked to work off the rust version.
> Can you mention the difference that made Go code is more readable than C++ and Rust?
Probably not, and to be fair anyone else's preference is equally valid. I think largely what individuals consider most readable depends less on what's being read and assessed now, and more on what route a developer has taken to reach this point in their career.
If I'm honest it's more about unconscious familiarity with idioms and constructs than it is an isolated unbiased opinion.
And I'd probably also change my comment slightly to say that by "Go code" I really did mean the actual code doing the work; I find the tests far less appealing.