Both of them are a lot safer than just about everything else, with each having their own strong areas. I think it’s fair to compare the SPARK subset of Ada with the “safe” subset of Rust.
Rust has a great story with memory safety though it’s worth noting that SPARK is adding pointer ownership analysis (inspired heavily by the borrow checker, which really is a big deal). In practice, as others mentioned, heap allocation isn’t needed that much with Ada, and passing by reference without needing pointers is common using “in out”.
Ada’s contracts and strong type system make it easier to design correct programs - is that “safer”? I don’t know. I wish _both_ languages were in wider use. I really enjoy Ada programming but envy the excitement and community being built around Rust. Hopefully Rust’s popularity leads to more enthusiasm for Ada and the unique correctness and safety advantages it brings.
Maybe the next GNAT release will have a “Rust” FFI import mechanism (like it does now for C and Fortran) so Ada code can more easily take advantage of all the great Rust libraries being written.
The “safety” Rust offers is just avoiding C and C++ specific memory access pitfalls, similar to pretty much any mainstream programming language except C and C++. It can also prevent data races, I’ll give it that.
The novelty is doing the above at near C and C++ performance levels, not the protection offered.
Rust is for the move fast and break things mainstream, where such “safety” is good enough.
Ada and SPARK are for safety-critical systems. In other words, it’s playing in a different league.
IMHO preconditions and postconditions are a big win for Ada. I missed them in Rust, so wrote the Rust "assertables" crate, which provides runtime assert expressions for a bunch of typical cases such as comparing numbers, strings, lists, IO, etc.
In Rust you can assert a runtime precondition or postcondition such as `a > b` and then handle the failure however you wish.
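Since the crate's exact macro output isn't shown here, a minimal plain-Rust sketch of the same idea, a `Result`-returning precondition check with names of my own choosing (this is not the assertables crate's actual code):

```rust
/// Hypothetical function requiring the precondition `a > b`.
fn checked_sub_positive(a: i32, b: i32) -> Result<i32, String> {
    // Precondition: a > b. Return Err instead of panicking,
    // so the caller can handle the violation as they wish.
    if a <= b {
        return Err(format!("precondition failed: a > b (a = {a}, b = {b})"));
    }
    let result = a - b;
    // Postcondition: the result is strictly positive.
    debug_assert!(result > 0, "postcondition failed: result > 0");
    Ok(result)
}

fn main() {
    assert_eq!(checked_sub_positive(5, 3), Ok(2));
    assert!(checked_sub_positive(1, 3).is_err());
}
```

Note the `debug_assert!` mirrors Ada's usual practice of disabling contract checks in release builds.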
Preconditions and postconditions seem like the wrong way to go about solving the issue they try to solve. They are essentially a secondary type system that tries to express information not expressed by the primary type system in a way that is awkwardly disconnected from the primary type system. You could instead just implement the functionality needed to incorporate that information into the primary type system. In practice this would result in an implementation of dependent types, which is fine.
> implementation of dependent types, which is fine.
Which you have to prove, which is absolutely non-trivial. I guess it could be turned into a runtime check where static analysis is not possible, but that would make the language's performance quite hard to reason about.
.NET Code Contracts had both compile-time and runtime checks, although it always felt like a bolted-on solution even though it was baked into the framework. I guess this is why it didn't carry forward into the .NET Core rewrite.
It is, but that's just Rust's style. The stdlib is deliberately kept small, so that an ecosystem would flourish.
In the design by contract case, there's a lot of experimentation (I linked this https://old.reddit.com/r/rust/comments/17miqiu/is_ada_safer_... in another comment) and while all of them are converging to the same syntax, they all have wildly different implementations.
Perhaps the stdlib could provide a base syntax for contracts, with extensibility APIs so that libraries or compiler plugins or external tooling can determine whether the contract is checked at runtime or at compile time, and by which method. But that demands a lot of design, because once something is in the stdlib and stabilized, it's here forever.
It was only available on enterprise versions, and most likely Microsoft concluded that not enough people were paying for it, and that it wasn't worth keeping for free or porting to the modern .NET runtime.
I wish more languages supported that natively, starting with C and C++.
The most typical example, in C, is output arguments passed by pointer, a pattern that I also use in C++ because I find passing by non-const reference error-prone: it is not obvious from the calling code that an argument may be modified.
But what about NULL? Sometimes it is a good thing that NULL is allowed; sometimes it doesn't make sense. So again, not obvious. So what do we do? You can document it, of course, but the problem is that compilers don't read the docs, and therefore can't tell you if you are doing it wrong. You can play it safe and design your API so that NULL is always an option, and avoid NULL when using other APIs, but that leads to unnecessary and performance-impacting checks. Preconditions and postconditions would solve that problem: if you pass an argument that can be NULL to a function that doesn't accept NULL, the compiler can warn you. Plus, there is optimization potential, in the same way C/C++ compilers exploit undefined behavior, but made explicit. You also get the choice between safe (runtime checks) and fast (undefined behavior) at compile time.
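For what it's worth, the property described above is roughly what Rust encodes with `Option` in the signature. A small illustrative sketch (function names are made up):

```rust
// Output argument that must be present: plain `&mut i32`.
fn write_required(out: &mut i32) {
    *out = 42;
}

// Output argument that may legitimately be absent: `Option<&mut i32>`.
// The compiler forces the callee to handle the None case.
fn write_optional(out: Option<&mut i32>) {
    if let Some(slot) = out {
        *slot = 42;
    }
}

fn main() {
    let mut x = 0;
    write_required(&mut x);
    assert_eq!(x, 42);

    let mut y = 0;
    write_optional(Some(&mut y));
    assert_eq!(y, 42);

    write_optional(None); // "NULL allowed" is part of the type, and checked
}
```

Thanks to the niche optimization, `Option<&mut i32>` is still pointer-sized, so the distinction costs nothing at runtime.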
Making it a core feature of the language rather than an extension (like your rust crate) will help compilers and other tools to take full advantage of what it brings.
Merely because the "assert" wording is more familiar to more Rust developers because it's similar to typical test code, and can create better error messages with details.
For what you're asking, the assertables crate has a macro that will handle any condition:
assert_as_result!(condition) -> Result
And you can alias it as you wish such as:
use assertables::assert_as_result as precondition;
Probably to get better error messages, same as assert_eq; otherwise you need a procedural macro to take apart the entire expression, which is way more complicated and more expensive.
I’m glad that domain-limited integer types are called out: I wish more languages had that.
At the same time: there’s a reasonable practical argument that both have essentially safe semantics, and that “safer” one is really just the more successful (easier to integrate into existing codebases, more online resources, &c.) one. Rust is (thus far) winning at that.
That feature really helps when you use integers for indexing, e.g., like an enum. Say you are parsing a file format with a type field which can have values 5, 6, 7, you can easily check if that's valid with a single call and lookup something, like a corresponding length in an array without having to translate those field values to array indices first.
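A rough Rust sketch of the parsing example above (the constants and lengths are made up for illustration). Ada's ranged subtypes let the compiler carry the 5..7 constraint in the type itself; here the range check is an explicit runtime guard:

```rust
// Field values 5, 6, 7 map to lengths 16, 32, 64 (made-up numbers).
const FIRST_VALID: u8 = 5;
const LENGTHS: [usize; 3] = [16, 32, 64];

fn length_for_type(field: u8) -> Option<usize> {
    // One range check stands in for Ada's
    // `subtype Kind is Integer range 5 .. 7`.
    if (FIRST_VALID..FIRST_VALID + 3).contains(&field) {
        // Reuse the raw field value as the array index directly.
        Some(LENGTHS[(field - FIRST_VALID) as usize])
    } else {
        None
    }
}

fn main() {
    assert_eq!(length_for_type(6), Some(32));
    assert_eq!(length_for_type(9), None);
}
```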
I’ve used Ada. I’ve not used Rust beyond a few hello world type programs. Personally, I like Ada quite a bit, and I really do like Spark. But I think that Ada and Rust are aiming at similar problems in entirely different ways. One of the key design choices in Ada was to emphasize maintainability. Rust’s answer was to emphasize memory safety. While the Ada compiler does quite a few safety checks, these aren’t part of the spec so that will vary by compiler. I am not sure that direct comparisons between the two languages are entirely useful. Ada is used for things like cruise missiles and jets. Rust is used for things like browsers and reimplementing GNU. Both are important, but they aren’t exactly similar.
ada has some things that rust doesn't. a lot of them (bounded numbers, pre/post conditions) are really good but do not present a case for writing the category of program rust targets. nor the other way round, really.
when the dust settles and hindley-milner types, rank-n types, substructural types, etc, all complete their respective journeys from "weird thing the academics came up with" to "state of the art in hip new language that managed to compellingly package it" to "fact of life", it is my hope that dependent types will follow suit -- something that bridges the ux gap between theorem prover and general-purpose language where anything from ownership and borrowing to refinement types lives under one HOL umbrella and lets you make just about any guarantee you like. i recommend anyone reading follows the f* project
Dependent types are actually quite natural in the "compile time evaluated" subset of a programming language like Ada or Rust. For ordinary program code, they run into the obvious issue that dependent typing subverts the usual phase separation between compile and run time, in the absence of an ad-hoc "code extraction" step. Dependent types and contracts also promote "code" to what is effectively the typing level, which introduces thorny API-stability concerns - i.e. the exact way you express a pre/post-condition or invariant in a library API becomes something that downstream users can end up relying on. Apart from that, Rust is already getting there with their advanced const-generics work.
One problem with dependent types is having to write the proofs, but perhaps AI can help with this. The type system can be the world the AI lives in, and we don't have to check the proofs too closely; we only need to know that they prove our high-level assertions.
You misunderstand. When working with dependent typed languages sometimes you have to prove things. For example, my function might receive a sorted array, but it has to be proved that the array is sorted, otherwise this is a compile time error--remember, all of this is happening at compile time. So, you write the proof using code and check it in with the rest of your code. It's not something that's going to fail randomly, it's just code.
If AI wrote the proof, this would be no different than having Copilot fill in some code, it's just code you check into the repo.
These proofs often involve writing code, any code, that will pass the type checker. If you can write code, no matter what that code is, and it passes the type checker, the assertion is proved. It's such a well structured problem, "just write code that satisfies these type constraints, any code will do, as long as it type checks", that I think AI might be able to do it quite well.
I've commented on this before and it wasn't well received then either. I think most people have never coded in a dependently typed language and don't appreciate how often you simply want to fill in the holes and connect the dots with any code that type checks. It's been several years since I experimented with dependent types, so I've forgotten a lot of the details, but I do remember often wanting to just "fill in the gaps", and indeed, there are IDE tools that can often fill in the gaps for you.
(Another thing I was surprised to realize about languages with strong type systems is how often there is literally only one possible implementation of a function.)
I have seen demos of Idris where just the type signature itself allows for automatic function implementation, which is very cool and almost magical.
And you are probably right that many of the trivial cases can perhaps be guessed by some AI — but I think in these cases a simpler, more deterministic algorithm could also find a match, iterating through a few proving strategies (I believe this is how the aforementioned Idris demo worked). What might have resulted in your downvote (I didn't downvote your comment, I think it adds to the discussion!) is that many people are tired of the over-hype of LLMs, and may have blindly taken your comment as support of that.
With that said, I do think that you overestimate the capabilities of LLMs. My go-to example: they can't even solve Sudokus. And any other similar, "recursive" thought is simply fundamentally impossible for LLMs in a single step.
It is also the barrier to the popularity of dependently typed languages - it is easy to give a proof of List::head. Database::connect, on the other hand, goes through multiple layers of the whole stack, and function proofs do not compose — the composition of two trivially provable functions might be impossible to prove, e.g. I can compose two such functions and get some unsolved math problem of your choice.
LLMs can't solve Sudoku, but neural networks plus tree search can solve Go. As you say, traditional techniques, like a simple tree search can already solve a lot of the "holes" that come up in dependent languages. Perhaps something like AlphaGo combined with a LLM (to accept human guidance using language) could work?
The proof of the pudding is in the tasting. I can't find any evidence online that anyone has ever used Rust in software where its correctness needs to be certified, and audited with something like DO-178C. I think that settles the debate. Maybe Rust could be as safe as Ada, once the Rust type system has been proven, and there's a verified compiler. Those things haven't happened yet though, so it's not likely to be used in a safety-critical environment anytime soon. Ada has been used in this area for decades.
You might be interested in https://ferrous-systems.com/ferrocene/, which "is an ISO 26262 (ASIL D) and IEC 61508 (SIL 4) qualified version of the existing open-source compiler, rustc, based on Rust 1.68".
Do any languages besides Ada use an identifier naming convention that merges PascalCase and snake_case like Ada_Identifier_Name? Popular languages like Python, Ruby, and Rust already mix both PascalCase and snake_case (for class and function names, respectively), so consistency isn’t the reason.
This naming convention avoids acronym capitalization ambiguities and reads more like natural written language. For example, instead of normalizing the inconsistently-capitalized identifier XMLHttpRequest as XMLHTTPRequest, XmlHttpRequest, or xml_http_request, you would use
XML_HTTP_Request.
I like it quite a lot while reading, but it's a holy hell to keep reaching between shift and underscore. Besides, what do you call an iterator variable? I? i? Idx? Index? Iterator_Index?
There's a solid argument that memory-managed languages like Java are safer than Rust, because although your program can crash, it can't segfault unless you're doing something especially weird. Safe Rust programs can also crash via `panic!`, and unsafe Rust programs can segfault and create wonky UB just like C programs if the safety invariants aren't manually enforced.
The safest language would be something like Elm or Datalog, where the type system ensures crashes aren't even possible. But then you just make up your own idea of failure, where instead of crashing the program produces unexpected output, almost like UB.
Rust's panic is technically very similar to a Java exception. It does not lead to any unsafety. It throws before executing anything that would be unsafe.
Safe Rust can't segfault, unless you're using unsafe code that is buggy, which isn't that much different from using buggy JNI.
Beyond just memory safety, Rust has a stronger type system than Java — tracks mutability, ownership, and thread safety that Java doesn't.
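To illustrate the distinction: in safe Rust an out-of-bounds access is a defined, catchable panic rather than undefined behavior. A small self-contained sketch:

```rust
use std::panic;

// Checked access never panics: absence is just an Option.
fn checked(v: &[i32], i: usize) -> Option<i32> {
    v.get(i).copied()
}

// Indexing past the end panics with a defined error, which can be
// observed and recovered from, unlike a segfault.
fn panics_on_oob(i: usize) -> bool {
    let v = vec![1, 2, 3];
    panic::catch_unwind(move || v[i]).is_err()
}

fn main() {
    assert_eq!(checked(&[1, 2, 3], 10), None);
    assert!(panics_on_oob(10));
    assert!(!panics_on_oob(0));
}
```

(Running this prints the panic message to stderr, which is itself evidence the failure is defined and observable, not silent memory corruption.)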
Besides the obvious reasons this is an invalid comparison I found the Rust bugs returned by your search quite interesting. For some reason that term seems to disproportionally turn up very poor bug reports. It's mostly compiler crashes caused by weird issues on the reporter's machine. For example: "I cannot reproduce this on rust tarball sources outside our Yocto build environment."
Edit: Part of the problem is your search term is `stack overflow` not `"stack overflow"`. Many of your results are issues where the reporter included a stack trace and someone mentioned the word "overflow".
I noticed that many people praising Ada on the internet don't actually know the language.
It's odd. Why advocate for a language you don't use?
Here are some common giveaways:
- "Ada has a garbage collector" - It's an optional part of the specification; in practice most compilers don't implement it.
- Confusing SPARK (the formal verifier) with Ada, or even claiming that SPARK is now part of the Ada spec. This is simply not true.
- Not knowing that contracts are enforced at runtime and they're typically disabled in release builds.
- Claiming that Ada is memory safe. Ada doesn't have a borrow checker. Ada doesn't have smart pointers out of the box (unique_ptr is actually unimplementable due to lack of move semantics). Typical non-embedded codebase is full of "new" and "Unchecked_Deallocation", usually wrapped with controlled types, much like C++ RAII.
I am a novice-intermediate Ada user (and I started with SPARK before moving to just Ada), but a lot of the arguments in favor of Rust mention that "there is a crate for that". I would say you need to compare vanilla Rust "out-of-the-box" with Ada for a fair comparison.
Another reason I prefer Ada is because I find it boringly simple like Pascal even though it is verbose, whereas I find Rust very obtuse, and I have a personal bias for array languages and ML languages. I wish Rust were more F#-like.
For the case of "smart pointers", and the larger point of "safety": rust has that included out of the box: Box, Rc, Arc, RefCell are all smart pointers included in the standard library. On safety of the language: the borrow checker exists, move semantics are the default (ie: included in the language) and copy is explicit, and real programs that isolate unsafety effectively (ie: so it doesn't need to be known outside of the module it exists in) are common.
Basically unique_ptr is the default in Rust. I also like to say that Rust is “just” the compile-time enforcement of C++’s smart pointers, which is technically not 100% correct, but gives a good mental image for the uninitiated, I guess.
edit: actually not RefCell. That's more Mutex-like. Which is another problem Rust solves: the tooling for eliminating data races at compile time is mature and pretty complete. The race elimination in Ada appears to rely on using SPARK (as noted elsewhere: not universally used, from others' experiences), but I'm not too familiar, so it's possible I'm wrong there.
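A quick sketch of the "unique_ptr is the default" point: ownership moves on assignment and the compiler (not a runtime check) rejects use-after-move, while shared ownership via Rc is explicit and opt-in:

```rust
use std::rc::Rc;

// Consuming a Box takes ownership, like passing a unique_ptr by value.
fn take_ownership(b: Box<i32>) -> i32 {
    *b // the heap allocation is freed when `b` drops here
}

fn main() {
    let a = Box::new(42); // uniquely owned heap allocation
    let b = a;            // ownership moves; `a` is now unusable
    // println!("{}", a); // compile error: borrow of moved value `a`
    assert_eq!(take_ownership(b), 42);

    // Shared ownership is opt-in and explicit:
    let c = Rc::new(7);
    let d = Rc::clone(&c);
    assert_eq!(Rc::strong_count(&d), 2);
}
```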
> but a lot of the arguments in favor of Rust mention that "there is a crate for that". I would say you need to compare vanilla Rust "out-of-the-box" with Ada for a fair comparison.
Smart pointers are vanilla rust shrug.
But also, it's pretty fair to mention the availability of crates, because they're easy to integrate, liberally used by the project itself (go browse https://github.com/rust-lang and you'll see a number of projects for libraries distributed as crates rather than part of the stdlib), and there's a large breadth of available packages.
The internet tells me Alire is a (the?) package manager for Ada, its listing has 373 entries right now (https://alire.ada.dev/crates.html).
Simple odds say you're more likely to find a crate for that than an ada package.
>I noticed that many people praising Ada on the internet
Not seen that anywhere, on Reddit or HN. I mean, Ada is rarely even mentioned. On HN, Ada is sometimes hated by many. And this is speaking as someone who gets all the flak for bringing up Ada whenever someone says Rust is the only PL for safety-critical applications.
I am actually rather surprised this submission got so many people commenting.
And to be honest, I think most of those points aren't because they don't know the language; it seems to be more a lack of context. (Which is often the case in any online discussion.)
Your last point is the largest thing I noticed in the one large codebase I looked at. Ada with a borrow checker and linear types (aka Austral) would be great. Rooting for Austral, though I actually prefer obtuse, dense syntax to verbose, simple syntax. Not sure why. I think I may have a mental cache size related to the number of characters on a screen vs the number of tokens.
I figure the preference in syntax is a trade-off between how hard it is to learn and how hard it is to read day-to-day.
More regular and verbose syntax is easier to learn, but requires more actual reading when you use it.
More concise but complex syntax takes longer to internalize, but then you can see more at a glance, so it lets you do more visual pattern matching and puts less strain on working memory.
C++03 had no move semantics either, it got added in a later version of the standard. Controlled types/RAII is also the basic building blocks of smart pointers, and it's quite normal that they be implemented with unsafe code under the hood.
Well, when people say "Ada" they usually mean SPARK, not an old or feature-limited version of Ada.
- SPARK does verify contracts using static analysis and proofs.
- While Ada is not as memory safe as Rust, it's a lot more memory safe than popular languages.
- It's one of the easiest languages for achieving compliance with safety standards like DO-178C or DO-333 ...
> -While Ada is not as memory safe as Rust it's a lot more memory safe than popular languages
Popular languages typically have garbage collectors and are memory safe. If a language is not memory safe, like Ada, then it's automatically one of the least memory safe languages, measured by amount of use.
For hard real-time normal Rust (and I guess also Ada) is insufficient. This is such a niche of a sub-niche, that you no longer write the same language as for everything else, most OSs already preclude hard real-time by their very existence, etc.
Soft real-time is absolutely doable with a GC, depending on the required targets. Java's low-latency GC (ZGC) has <1ms max pause times - you might not make the latest AAA game with it, but for the vast majority of popular games that is more than good enough.
Ada has inbuilt features allowing you to restrict yourself to a safe subset of the language for real-time computing.[0] It can be argued that this is not "normal" Ada since you are restricting the use of the entire language, but the restriction profile itself is a feature of the language.
You easily get 1ms pauses from the OS simply context-switching away from your program, so as I mentioned, unless you explicitly tune your OS, you get these kinds of pauses no matter the language, even if it's hand-written assembly.
Regarding the non-normal language: with hard real-time you wouldn’t want to allocate willy-nilly and rust doesn’t yet have generics for allocators (correct me if I’m wrong on this, I am not up-to-date here), so careless usage of the standard lib, or other crates is a no-go. For illustrative purposes, this is the same with C++ and `noexcept`.
> Some slippery definitions here
I don’t see the slippery part - it is a slope, though. Depending on what’s the latency requirements and to what percentile should it happen, you can use Rust (and other low-level languages) on the lower end (audio processing), some managed languages in the middle (many kind of games are absolutely fine with that - a single drop of frame is more than fine, hence the soft rt), and pretty much any managed language on the high end.
It depends on what you call "ordinary Rust", I guess. You can use an allocator on embedded, but likely won't want to. There are static types that provide the same APIs as the allocating data structures you'd miss[1]. Granted, you're not gonna be able to take any random crate that isn't designed for such an environment, but that's a different problem, and there's a fledgling ecosystem of no_std crates out there. Despite all those caveats, you can still write very high-level code in embedded Rust, even async/await state machines![2]
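As a sketch of what such "static types" look like, here is a toy fixed-capacity vector in plain Rust (real no_std crates like heapless provide polished versions of this; names here are my own):

```rust
// Toy fixed-capacity vector: all storage is inline, no heap at all.
struct FixedVec<T, const N: usize> {
    buf: [Option<T>; N],
    len: usize,
}

impl<T, const N: usize> FixedVec<T, N> {
    fn new() -> Self {
        Self { buf: [(); N].map(|_| None), len: 0 }
    }

    // Push hands the value back instead of reallocating when full.
    fn push(&mut self, value: T) -> Result<(), T> {
        if self.len == N {
            return Err(value);
        }
        self.buf[self.len] = Some(value);
        self.len += 1;
        Ok(())
    }

    fn len(&self) -> usize {
        self.len
    }
}

fn main() {
    let mut v: FixedVec<u8, 2> = FixedVec::new();
    assert!(v.push(1).is_ok());
    assert!(v.push(2).is_ok());
    assert_eq!(v.push(3), Err(3)); // full: fail loudly, never allocate
    assert_eq!(v.len(), 2);
}
```

The capacity lives in the type (`const N`), so "out of memory" becomes an ordinary, handleable `Result` rather than a hidden allocation.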
In case others were curious to look up the timeline, WP says first validated GNAT release was in 1995, vs 1983 for the earliest validated implementation.
(but perhaps the earlier stuff wasn't practical: "Early Ada compilers struggled to implement the large, complex language, and both compile-time and run-time performance tended to be slow and tools primitive" ... "By the late 1980s and early 1990s, Ada compilers had improved in performance")
Another issue that delayed adoption, particularly in regards to GNAT, was the license, which was somewhat contentious, similar to what happened with Qt or with Java implementations.
It might be because I have "showdead" enabled but I see that there are many frequent comments from this user when clicking their page. Every comment I see from them seems to be flagged/dead though...
There is a lot of value in constraining what the programmer can do. It's not always about getting to minimum viable product as fast as possible, in some fields you actually need to care about reliability and security. We keep finding security vulnerabilities in "battle tested" foundational software - it would be great if we could stop, or at least severely reduce that some day.
The state of software today is more like having several layers of swiss cheese and hoping that the holes don't happen to match anywhere.
> Programming languages are tools to get shit done.
I, on the other hand, find it baffling that some programmers, despite the fact that programmers generally make computers useful for other people, don't care how computers can be more useful for them.
Those guys in the "square wheels comic"[0] are also getting shit done.
> don't care how computers can be more useful for them
My guess is that the OP doesn't see this as a computer being more useful for them, but instead sees it as being less useful as it requires more work to get the same result.
And if you're in a domain where crashes aren't necessarily a problem, or where security isn't a top priority (non-public facing tools often fall into this category), correctness (in the general case) and safety don't automatically map to utility.
> a domain where crashes aren't necessarily a problem
So an application that doesn't deal with user data? Those are rare, but do exist.
> where security isn't a top priority (non-public facing tools often fall into this category)
So nothing accessible remotely, no connecting to a server to influence its behavior, no auto-update, no cross-user collaboration. I sometimes hear "this doesn't need safety, it's just a multiplayer game" and don't know how to answer in a constructive way. Even an application like grep can be exploited if a (hypothetical) unmitigated coding mistake exists and it reads a specially crafted file that somehow ended up innocuously on your filesystem. The threat model is much bigger than you're giving it credit for.
> I roll my eyes every time I hear about yet another programming language for which the biggest invention is that it does not allow null pointers.
> It is even worse when a programming language manages to solve an easily preventable problem by creating additional workload on the programmer.
What programming languages are you referring to here? Both Rust and Ada do more than disallow nulls, and help solve some non-trivial problems. Of course at the expense of additional constraints, but that's probably unavoidable.
Right, and those C programmers talk big but there's no software written in C, where do they get off?
...I really don't understand this comment, there's so much Lisp code out there being used for everything from airfare search to video games, it seems willfully ignorant to say nothing is released in Lisp. I know, lucky 10,000, but still...
No, it is more like: having tools that can technically create better armor does not necessarily mean you will, on average, create better armor than someone with less capable tools.
Ok but when people say "is Ada safer than Rust" they obviously mean when both languages are used by competent people who are familiar with the language. So no, it is not like that.