Is Ada safer than Rust? (reddit.com)
166 points by ajdude on Dec 2, 2023 | hide | past | favorite | 112 comments


Both of them are a lot safer than just about everything else, with each having their own strong areas. I think it’s fair to compare the SPARK subset of Ada with the “safe” subset of Rust.

Rust has a great story with memory safety though it’s worth noting that SPARK is adding pointer ownership analysis (inspired heavily by the borrow checker, which really is a big deal). In practice, as others mentioned, heap allocation isn’t needed that much with Ada, and passing by reference without needing pointers is common using “in out”.

Ada’s contracts and strong type system make it easier to design correct programs - is that “safer”? I don’t know. I wish _both_ languages were in wider use. I really enjoy Ada programming but envy the excitement and community being built around Rust. Hopefully Rust’s popularity leads to more enthusiasm for Ada and the unique correctness and safety advantages it brings.

Maybe the next GNAT release will have a “Rust” FFI import mechanism (like it does now for C and Fortran) so Ada code can more easily take advantage of all the great Rust libraries being written.


The “safety” Rust offers is just avoiding C and C++ specific memory access pitfalls, similar to pretty much any mainstream programming language except C and C++. It can also prevent data races, I’ll give it that.

The novelty is doing the above at near C and C++ performance levels, not the protection offered.

Rust is for the move fast and break things mainstream, where such “safety” is good enough.

Ada and SPARK are for safety-critical systems. In other words, it’s playing in a different league.


IMHO preconditions and postconditions are a big win for Ada. I missed them in Rust, so wrote the Rust "assertables" crate, which provides runtime assert expressions for a bunch of typical cases such as comparing numbers, strings, lists, IO, etc.

Rust code to assert a runtime precondition or postcondition a > b then handle it as you wish:

    assert_gt_as_result!(a, b) -> Result
https://crates.io/crates/assertables
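The same Result-returning check can be sketched with plain std, for readers who want to see the shape without the crate (a hand-rolled stand-in, not the assertables crate's implementation):

```rust
// Hand-rolled stand-in for a Result-returning precondition check
// (illustrative only; not how the assertables crate is implemented).
fn require_gt(a: i32, b: i32) -> Result<(), String> {
    if a > b {
        Ok(())
    } else {
        Err(format!("precondition failed: {a} > {b}"))
    }
}

fn divide(a: i32, b: i32) -> Result<i32, String> {
    require_gt(b, 0)?; // precondition: divisor must be positive
    Ok(a / b)
}

fn main() {
    assert_eq!(divide(10, 2), Ok(5));
    assert!(divide(1, 0).is_err()); // precondition violated, handled as Err
}
```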


Preconditions and postconditions seem like the wrong way to go about solving the issue they try to solve. They are essentially a secondary type system that tries to express information not expressed by the primary type system in a way that is awkwardly disconnected from the primary type system. You could instead just implement the functionality needed to incorporate that information into the primary type system. In practice this would result in an implementation of dependent types, which is fine.


> implementation of dependent types, which is fine.

Which you have to prove, which is absolutely non-trivial. I guess it could be made into a runtime check where static analysis is not possible, but the resulting language would be quite hard to reason about regarding performance.


.NET Code Contracts has both compile time and runtime checks, although it always felt like a bolted on solution even though it is baked into the framework. I guess this is why it didn't carry forward into the .NET Core rewrite.


Bertrand Meyer is coming for you... ;-)

He's not, he would respect your opinion, but design by contract was one of the key foundations of Eiffel, and I have a soft spot for Eiffel.


Maybe you don't have to encode everything in the type system?

Also: Maybe not everything goes in the compiler?



And I can't edit, but Rust can statically check contracts too

https://old.reddit.com/r/rust/comments/17miqiu/is_ada_safer_...


Isn’t it its own language to a degree? Which may or may not be a bad thing in itself, but I do think there is value in having it natively available.


It is, but that's just Rust's style. The stdlib is deliberately kept small, so that an ecosystem would flourish.

In the design by contract case, there's a lot of experimentation (I linked this https://old.reddit.com/r/rust/comments/17miqiu/is_ada_safer_... in another comment) and while all of them are converging to the same syntax, they all have wildly different implementations.

Perhaps the stdlib could provide a base syntax for contracts, with extensibility APIs so that libraries or compiler plugins or external tooling can determine whether the contract is checked at runtime or at compile time, and by which method. But that demands a lot of design, because once something is in the stdlib and stabilized, it's here forever.


C# used to have postconditions and verify them at compile time. But this has been deprecated :( I wish I knew why they removed it.

https://learn.microsoft.com/en-us/dotnet/framework/debug-tra...


It was only available on enterprise versions, and most likely Microsoft concluded that not enough people were paying for it, nor did they want to keep it for free by porting it to the modern .NET runtime.


I wish more languages supported that natively, starting with C and C++.

The most typical example, in C, is output arguments passed by pointer, a pattern that I also use in C++ because I find passing by non-const reference error-prone, since it is not obvious from the calling code that an argument may be modified.

But what about NULL? Sometimes it is a good thing that NULL is allowed; sometimes it doesn't make sense. So again, not obvious. So what do we do? You can document it, of course, but the problem is that compilers don't read the docs, and therefore can't tell you when you are doing it wrong. You can play it safe and design your API so that NULL is always an option, and avoid NULL when using others' APIs, but that leads to unnecessary, performance-impacting checks. Preconditions and postconditions would solve that problem: if you pass an argument that can be NULL to a function that doesn't accept NULL, the compiler can warn you. Plus, there is optimization potential, in the same way that C/C++ exploits undefined behavior, but explicit. You also get to choose between safe (runtime checks) and fast (undefined behavior) at compile time.

Making it a core feature of the language rather than an extension (like your rust crate) will help compilers and other tools to take full advantage of what it brings.
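For contrast, Rust expresses exactly this nullability precondition in its types, so the compiler effectively does read the docs (a minimal sketch):

```rust
// `&i32` can never be null: the compiler rejects any call
// that doesn't supply a valid reference.
fn must_have(x: &i32) -> i32 {
    *x + 1
}

// `Option<&i32>` says "null is allowed" explicitly, and the
// compiler forces the None case to be handled.
fn may_have(x: Option<&i32>) -> i32 {
    match x {
        Some(v) => *v + 1,
        None => 0,
    }
}

fn main() {
    let n = 41;
    assert_eq!(must_have(&n), 42);
    assert_eq!(may_have(Some(&n)), 42);
    assert_eq!(may_have(None), 0);
}
```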


https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2023/p29... C++ has been preparing this for years, though it's uncertain when it will actually land in C++.


D has contract programming, e.g. an `in (param < 0)` contract written before the `{` of the function body.

I don't think there's all that much that a typical compiler will do with them; however, with a sophisticated enough type system I can see some wins.


Microsoft's compiler has that ability with SAL annotations. But yes a standard way to do it would be nice.

https://learn.microsoft.com/en-us/cpp/code-quality/using-sal...


I've started using abseil null/no null

https://www.google.com/url?sa=t&source=web&rct=j&opi=8997844...

We have a linter that will try to catch cases where you didn't null check but the field is nullable


> I find passing by non-const reference error-prone since it is not obvious looking at the calling code that an argument may be modified.

VS Code makes it pretty obvious by adding a visual & prefix to the argument. I hope that kind of visual help becomes widely adopted.


* https://en.wikipedia.org/wiki/Assert.h

?

C11 and C++11 also have compile-time static variants.


It seems like there's a very minor error under the "Why use this?" section:

    assert_gt!(value1, value2); // value1 ≥ value2
That should be ">" rather than "≥". (Or "assert_ge" rather than "assert_gt".)


You're right. Thank you! Fixing it now.


There is also anyhow::ensure!() (which is more of a requires() but anyhow...)


Why the specialized API, rather than (say) precondition(foo == bar)?


Merely because the "assert" wording is more familiar to Rust developers, since it's similar to typical test code, and the macros can create better error messages with details.

For what you're asking, the assertables crate has a macro that will handle any condition:

    assert_as_result!(condition) -> Result
And you can alias it as you wish such as:

   use assertables::assert_as_result as precondition;
And now you can write what you're suggesting:

   precondition!(a == b)


Probably to get better error messages, same as assert_eq; otherwise you need a procedural macro to take apart the entire expression, which is way more complicated and more expensive.


I’m glad that domain-limited integer types are called out: I wish more languages had that.

At the same time: there’s a reasonable practical argument that both have essentially safe semantics, and that “safer” one is really just the more successful (easier to integrate into existing codebases, more online resources, &c.) one. Rust is (thus far) winning at that.


That feature really helps when you use integers for indexing, e.g., like an enum. Say you are parsing a file format with a type field which can have values 5, 6, 7, you can easily check if that's valid with a single call and lookup something, like a corresponding length in an array without having to translate those field values to array indices first.


Yep, although Rust manages to get there just fine with sum types and fallible conversion traits.
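A sketch of what that looks like for the 5/6/7 file-format example above (all names are hypothetical):

```rust
use std::convert::TryFrom;

// The file format's three valid type values, as a sum type.
#[derive(Debug, PartialEq)]
enum RecordKind {
    A, // wire value 5
    B, // wire value 6
    C, // wire value 7
}

impl TryFrom<u8> for RecordKind {
    type Error = u8;
    // The fallible conversion is the single validity check.
    fn try_from(v: u8) -> Result<Self, u8> {
        match v {
            5 => Ok(RecordKind::A),
            6 => Ok(RecordKind::B),
            7 => Ok(RecordKind::C),
            other => Err(other), // invalid wire value
        }
    }
}

// Once validated at the boundary, lookups are total: no index
// translation and no "invalid value" case left to handle.
fn payload_len(kind: RecordKind) -> usize {
    match kind {
        RecordKind::A => 4,
        RecordKind::B => 8,
        RecordKind::C => 16,
    }
}

fn main() {
    assert_eq!(RecordKind::try_from(6), Ok(RecordKind::B));
    assert!(RecordKind::try_from(9).is_err());
    assert_eq!(payload_len(RecordKind::C), 16);
}
```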


I’ve used Ada. I’ve not used Rust beyond a few hello world type programs. Personally, I like Ada quite a bit, and I really do like Spark. But I think that Ada and Rust are aiming at similar problems in entirely different ways. One of the key design choices in Ada was to emphasize maintainability. Rust’s answer was to emphasize memory safety. While the Ada compiler does quite a few safety checks, these aren’t part of the spec so that will vary by compiler. I am not sure that direct comparisons between the two languages are entirely useful. Ada is used for things like cruise missiles and jets. Rust is used for things like browsers and reimplementing GNU. Both are important, but they aren’t exactly similar.


ada has some things that rust doesn't. a lot of them (bounded numbers, pre/post conditions) are really good but do not present a case for writing the category of program rust targets. nor the other way round, really.

when the dust settles and hindley-milner types, rank-n types, substructural types, etc, all complete their respective journeys from "weird thing the academics came up with" to "state of the art in hip new language that managed to compellingly package it" to "fact of life", it is my hope that dependent types will follow suit -- something that bridges the ux gap between theorem prover and general-purpose language where anything from ownership and borrowing to refinement types lives under one HOL umbrella and lets you make just about any guarantee you like. i recommend anyone reading follows the f* project


For the curious (as f* is hard to search for): https://fstar-lang.org/


Dependent types are actually quite natural in the "compile time evaluated" subset of a programming language like Ada or Rust. For ordinary program code, they run into the obvious issue that dependent typing subverts the usual phase separation between compile and run time, in the absence of an ad-hoc "code extraction" step. Dependent types and contracts also promote "code" to what is effectively the typing level, which introduces thorny API-stability concerns - i.e. the exact way you express a pre/post-condition or invariant in a library API becomes something that downstream users can end up relying on. Apart from that, Rust is already getting there with their advanced const-generics work.


One problem with dependent types is having to write the proofs, but perhaps AI can help with this. The type system can be the world the AI lives in, and we don't have to check the proofs too closely, only know that they prove our high-level assertions.


I would definitely want a CI pipeline that randomly fails depending on what day ChatGPT has

(Yeah, I know about temperature, but AIs are dumb as hell especially at logical inference stuff, so I just went with the comment in a satirical way)


You misunderstand. When working with dependently typed languages sometimes you have to prove things. For example, my function might receive a sorted array, but it has to be proved that the array is sorted, otherwise this is a compile-time error--remember, all of this is happening at compile time. So, you write the proof using code and check it in with the rest of your code. It's not something that's going to fail randomly, it's just code.

If AI wrote the proof, this would be no different than having Copilot fill in some code, it's just code you check into the repo.

These proofs often involve writing code, any code, that will pass the type checker. If you can write code, no matter what that code is, and it passes the type checker, the assertion is proved. It's such a well structured problem, "just write code that satisfies these type constraints, any code will do, as long as it type checks", that I think AI might be able to do it quite well.

I've commented on this before and it wasn't well received then either. I think most people have never coded in dependently type language and don't appreciate how often you simply want to fill in the holes and connect the dots with any code that type checks. It's been several years since I experimented with dependent types, so I forgot a lot of the details, but I do remember often wanting to just "fill in the gaps", and indeed, there are IDE tools that can often fill in the gaps for you.

(Another thing I was surprised to realize about languages with strong type systems is how often there is literally only one possible implementation of a function.)
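A minimal Lean 4 sketch of the "proof is just an argument" idea (the function name is hypothetical):

```lean
-- headOf demands a proof that the list is non-empty; calling it on []
-- is a compile-time error rather than a runtime crash.
def headOf (xs : List Nat) (h : xs ≠ []) : Nat :=
  match xs, h with
  | x :: _, _ => x
  | [], h' => absurd rfl h'

-- The caller supplies the proof; `by decide` computes it for a literal.
#eval headOf [3, 1, 2] (by decide)
```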


I have seen demos of Idris where just the type signature itself allows for automatic function implementation, which is very cool and almost magical.

And you are probably right that many of the trivial cases can perhaps be guessed by some AI — but I think in these cases a simple, more deterministic algorithm could also find a match, iterating through a few proving strategies (I believe this is how the aforementioned idris demo worked). What might have resulted in your downvote (I didn’t downvote your comment, I think it adds to the discussion!) is that many people are tired of the over-hype of LLMs, and taken your comment blindly as support of that.

With that said, I do think that you overestimate the capabilities of LLMs; as my go-to example: they can't even solve Sudokus. And any other similar, "recursive" thought is simply fundamentally impossible for LLMs in a single step.

It is also the barrier to the popularity of dependently typed languages: it is easy to give a proof of List::head. Database::connect, on the other hand, goes through multiple layers of the whole stack, and function proofs do not compose; the composition of two trivial-to-prove functions might be impossible to prove, e.g. I can compose two such functions and get some unsolved math problem of your choice.


LLMs can't solve Sudoku, but neural networks plus tree search can solve Go. As you say, traditional techniques, like a simple tree search can already solve a lot of the "holes" that come up in dependent languages. Perhaps something like AlphaGo combined with a LLM (to accept human guidance using language) could work?


> LLMs can't solve Sudoku

Seems like ChatGPT-4 can? And that's not the advanced data analysis model, either.

https://chat.openai.com/share/3126e737-3ae5-4f63-996b-b5a768...


I was curious. I generated a very hard Sudoku puzzle and gave it to GPT4, which proceeded to write and execute a Python program to find the solution.


Which is not the same as it solving it, as there are a million sudoku solvers in its training set.


You can still make mana on https://manifold.markets/Mira/will-a-prompt-that-enables-gpt... if you believe LLMs can't solve Sudoku!


If you don't use the heap, which is common in real-time control, Ada has real advantages. That's what Ada is for, after all.


Eiffel had design by contract from way back.

https://en.m.wikipedia.org/wiki/Eiffel_(programming_language...

https://en.m.wikipedia.org/wiki/Design_by_contract

Unfortunately it suffered from some of the same issues as Ada:

For a long time, only expensive implementations were available, and a somewhat verbose syntax.


The proof of the pudding is in the tasting. I can't find any evidence online that anyone has ever used Rust in software where its correctness needs to be certified, and audited with something like DO-178C. I think that settles the debate. Maybe Rust could be as safe as Ada, once the Rust type system has been proven, and there's a verified compiler. Those things haven't happened yet though, so it's not likely to be used in a safety-critical environment anytime soon. Ada has been used in this area for decades.


You might be interested in https://ferrous-systems.com/ferrocene/, which " is a ISO 26262 (ASIL D) and IEC 61508 (SIL 4) qualified version of the existing open-source compiler, rustc, based on Rust 1.68."


You were one minute faster :). Golf clap.


It is just a matter of time and one important step has been achieved recently.

https://ferrous-systems.com/blog/officially-qualified-ferroc...


Oh, pardon me. I didn't realise Rust had a qualified compiler already. It looks like this is what Ferrous Systems was working on with AdaCore.


Do any languages besides Ada use an identifier naming convention that merges PascalCase and snake_case like Ada_Identifier_Name? Popular languages like Python, Ruby, and Rust already mix both PascalCase and snake_case (for class and function names, respectively), so consistency isn’t the reason.

This naming convention avoids acronym capitalization ambiguities and reads more like natural written language. For example, instead of normalizing the inconsistently-capitalized identifier XMLHttpRequest as XMLHTTPRequest, XmlHttpRequest, or xml_http_request, you would use XML_HTTP_Request.


I like it quite a lot while reading, but it's a holy hell to keep reaching between shift and underscore. Besides, what do you call an iterator variable? I? i? Idx? Index? Iterator_Index?


Map shift + space to underscore?

In my .vimrc, I mapped space to ; to enter VIM command mode:

  noremap <Space> :


Correction: I mapped space to colon : not semicolon ;


There's a solid argument that memory-managed languages like Java are safer than Rust, because although your program can crash, it can't segfault unless you're doing something especially weird. Safe Rust programs can also crash via `panic!`, and unsafe Rust programs can segfault and create wonky UB just like C programs if the safety invariants aren't manually enforced.

The safest language would be something like Elm or Datalog, where the type system ensures crashes aren't even possible. But then you just make up your own idea of failure, where instead of crashing the program produces unexpected output, almost like UB.


Rust's panic is technically very similar to a Java exception. It does not lead to any unsafety. It throws before executing anything that would be unsafe.

Safe Rust can't segfault, unless you're using unsafe code that is buggy, which isn't that much different from using buggy JNI.

Beyond just memory safety, Rust has a stronger type system than Java — tracks mutability, ownership, and thread safety that Java doesn't.
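To make the exception comparison concrete, here's a sketch showing that an out-of-bounds access in safe Rust panics, and that the panic can be caught much like an exception, rather than corrupting memory (catching panics like this is rarely appropriate outside FFI boundaries or thread supervisors):

```rust
use std::panic;

fn main() {
    // A panic unwinds the stack like a thrown exception; it is not UB,
    // and no memory-unsafe state is observable afterwards.
    let result = panic::catch_unwind(|| {
        let v: Vec<i32> = vec![1, 2, 3];
        v[99] // out of bounds: panics instead of reading bad memory
    });
    assert!(result.is_err()); // the panic was caught, like an exception
}
```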


Can't you segfault in Java via stuff in sun.misc.Unsafe?


Easy to compare:

Search the gnat bug tracker for stack overflow: 14 bugs https://gcc.gnu.org/bugzilla/buglist.cgi?quicksearch=Ada%20s...

Rust: 243 (plus 830 closed) https://github.com/rust-lang/rust/issues?q=is%3Aissue+is%3Ao...

Choose by yourself which seems safer


Certainly a compelling demonstration that Rust is much more popular than Ada.


So I guess the safety belt is also much more dangerous than not having it, as more people have accidents with seatbelts on? Similarly faulty logic.


Besides the obvious reasons this is an invalid comparison I found the Rust bugs returned by your search quite interesting. For some reason that term seems to disproportionally turn up very poor bug reports. It's mostly compiler crashes caused by weird issues on the reporter's machine. For example: "I cannot reproduce this on rust tarball sources outside our Yocto build environment."

Edit: Part of the problem is your search term is `stack overflow` not `"stack overflow"`. Many of your results are issues where the reporter included a stack trace and someone mentioned the word "overflow".


Well I've never blown through the stack in Assembly.


I noticed that many people praising Ada on the internet don't actually know the language.

It's odd. Why advocate for a language you don't use?

Here are some common giveaways:

- "Ada has a garbage collector" - It's an optional part of the specification; in practice most compilers don't implement it.

- Confusing SPARK (the formal verifier) with Ada, or even claiming that SPARK is now part of the Ada spec. This is simply not true.

- Not knowing that contracts are enforced at runtime and they're typically disabled in release builds.

- Claiming that Ada is memory safe. Ada doesn't have a borrow checker. Ada doesn't have smart pointers out of the box (unique_ptr is actually unimplementable due to the lack of move semantics). A typical non-embedded codebase is full of "new" and "Unchecked_Deallocation", usually wrapped with controlled types, much like C++ RAII.


>>Ada doesn't have smart pointers out of the box

I am a novice-intermediate Ada user (and I started with SPARK before moving to just Ada), but a lot of the arguments in favor of Rust mention that "there is a crate for that". I would say you need to compare vanilla Rust "out-of-the-box" with Ada for a fair comparison.

Another reason I prefer Ada is because I find it boringly simple like Pascal even though it is verbose, whereas I find Rust very obtuse, and I have a personal bias for array languages and ML languages. I wish Rust were more F#-like.


For the case of "smart pointers", and the larger point of "safety": rust has that included out of the box: Box, Rc, Arc, RefCell are all smart pointers included in the standard library. On safety of the language: the borrow checker exists, move semantics are the default (ie: included in the language) and copy is explicit, and real programs that isolate unsafety effectively (ie: so it doesn't need to be known outside of the module it exists in) are common.


Basically unique_ptr is the default in Rust. I also like to say that Rust is “just” the compile-time enforcement of C++’s smart pointers, which is technically not 100% correct, but gives a good mental image for the uninitiated, I guess.
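A sketch of that mental image, assuming nothing beyond std: `Box<T>` plays the role of `unique_ptr`, and a use-after-move is rejected at compile time rather than becoming a runtime bug:

```rust
fn main() {
    let a = Box::new(42); // uniquely owned heap allocation, like unique_ptr
    let b = a;            // ownership moves; `a` is no longer usable
    // println!("{}", a); // compile error: value used after move
    assert_eq!(*b, 42);   // freed automatically when `b` goes out of scope
}
```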


edit: actually not RefCell. That's more Mutex-like. Which is another area Rust covers: the tooling for eliminating data races at compile time is mature and pretty complete. The race elimination in Ada appears to rely on using SPARK (as noted elsewhere: not universally used, from others' experiences), but I'm not too familiar, so it's possible I'm wrong there.


> but a lot of the arguments in favor of Rust mention that "there is a crate for that". I would say you need to compare vanilla Rust "out-of-the-box" with Ada for a fair comparison.

Smart pointers are vanilla rust shrug.

But also, it's pretty fair to mention the availability of crates, because they're easy to integrate, liberally used by the project itself (go browse https://github.com/rust-lang and you'll see a number of projects for libraries distributed as crates rather than part of the stdlib), and there's a large breadth of available packages.

The internet tells me Alire is a (the?) package manager for Ada, its listing has 373 entries right now (https://alire.ada.dev/crates.html).

Simple odds say you're more likely to find a crate for that than an ada package.


> But also, it's pretty fair to mention the availability of crates, because they're easy to integrate, liberally used by the project itself

Yes

This is a strength and a weakness in Rust

The crate ecosystem is broad and deep. It is seamless to integrate crates if you are connected to the internet. That is a strength.

But it is also a weakness: "seamless to integrate crates if you are connected to the internet", but of unknown provenance and varying quality.

There are steps you can take, tools you can use to ameliorate the issue, but it is a real danger

It depends what you are working on. For systems which must be secure and reliable (weapons, flight control, banking), maybe Ada is a better choice.

For me I make musical instruments, and I will stick with Rust for that

I know Rust, not Ada. A very interesting discussion for me too.


>I noticed that many people praising Ada on the internet

Not seen that anywhere, on Reddit or HN. I mean, Ada is rarely even mentioned. On HN, Ada is sometimes hated by many. And this is speaking as someone who gets all the flak for bringing up Ada whenever someone says Rust is the only PL for safety-critical applications.

I am actually rather surprised this submission gets so many people commenting.

And to be honest, I think most of those points aren't because they don't know the language; it seems to be more a lack of context. (Which is often the case in any online discussion.)


Your last point is the largest thing I noticed in the one large codebase I looked at. Ada with a borrow checker and linear types (aka Austral) would be great. Rooting for Austral, though I actually prefer obtuse dense syntax to verbose simple syntax. Not sure why. I think I may have a mental cache size related to the number of characters on a screen vs the number of tokens.


I figure the preference in syntax is a trade-off between how hard it is to learn and how hard it is to read day-to-day.

More regular and verbose syntax is easier to learn, but requires more actual reading when you use it.

More concise but complex syntax takes longer to internalize, but then you can see more at a glance, so it lets you do more visual pattern matching and puts less strain on working memory.



C++03 had no move semantics either, it got added in a later version of the standard. Controlled types/RAII is also the basic building blocks of smart pointers, and it's quite normal that they be implemented with unsafe code under the hood.


A garbage collector was also part of C++ spec from C++11 to C++23.


Well when people say "Ada" they usually mean SPARK not old or feature limited version of Ada.

- SPARK does verify contracts using static analysis and proofs.
- While Ada is not as memory safe as Rust, it's a lot more memory safe than popular languages.
- It's one of the easiest languages for getting compliance with safety standards like DO-178C or DO-333 ...


> Well when people say "Ada" they usually mean SPARK not old or feature limited version of Ada.

Having worked on several Ada projects professionally I can tell you that this is definitely not true.

DO-178C applies to the process and does not make assumptions about programming languages.


> -While Ada is not as memory safe as Rust it's a lot more memory safe than popular languages

Popular languages typically have garbage collectors and are memory safe. If a language is not memory safe, like Ada, then it's automatically one of the least memory safe languages, measured by amount of use.


> Popular languages typically have garbage collectors and are memory safe. I

Yes

And those languages are unsuitable for real time systems

Where Rust, and it seems Ada, excel


For hard real-time normal Rust (and I guess also Ada) is insufficient. This is such a niche of a sub-niche, that you no longer write the same language as for everything else, most OSs already preclude hard real-time by their very existence, etc.

Soft real-time is absolutely doable with a GC, depending on the required targets. Java’s low-lat GC (ZGC) has <1ms max pause times - you might not make the latest AAA game with it, but for the vast majority of popular games that is more than good enough.


Ada has inbuilt features allowing you to restrict yourself to a safe subset of the language for real-time computing.[0] It can be argued that this is not "normal" Ada since you are restricting the use of the entire language, but the restriction profile itself is a feature of the language.

[0]:https://learn.adacore.com/courses/Ada_For_The_Embedded_C_Dev...


> For hard real-time normal Rust (and I guess also Ada) is insufficient

Really? Expand on that please. I did not think so

> Soft real-time is absolutely doable with a GC

Some slippery definitions here. Clearly there is a middle ground (where I live) where rust works. GC languages do not

One GC pause, ever, is too many


You easily get 1ms pauses simply from the OS context-switching away from your program, so as I mentioned, unless you explicitly tune your OS you have these kinds of pauses no matter the language, even if it's hand-written assembly.

Regarding the non-normal language: with hard real-time you wouldn’t want to allocate willy-nilly and rust doesn’t yet have generics for allocators (correct me if I’m wrong on this, I am not up-to-date here), so careless usage of the standard lib, or other crates is a no-go. For illustrative purposes, this is the same with C++ and `noexcept`.

> Some slippery definitions here

I don’t see the slippery part - it is a slope, though. Depending on what’s the latency requirements and to what percentile should it happen, you can use Rust (and other low-level languages) on the lower end (audio processing), some managed languages in the middle (many kind of games are absolutely fine with that - a single drop of frame is more than fine, hence the soft rt), and pretty much any managed language on the high end.


>You have 1ms pauses easily from simply the OS context-switching

It is routine for Rust code to run without an OS (e.g., on microcontrollers).


A subset of Rust. You can’t just write ordinary Rust for microcontrollers.


It depends on what you call "ordinary Rust", I guess. You can use an allocator on embedded, but likely won't want to. There are static types that provide the same APIs as the allocating data structures you'd miss[1]. Granted, you're not gonna be able to take any random crate that isn't designed for such an environment, but that's a different problem, and there's a fledgling ecosystem of no_std crates out there. Despite all those caveats, you can still write very high level code in embedded Rust, even async/await state machines![2]

1: https://docs.rust-embedded.org/book/collections/index.html

2: https://interrupt.memfault.com/blog/embedded-async-rust
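The "static types that provide the same APIs" idea can be sketched without any crate: a fixed-capacity vector backed by an inline array, needing no allocator at all (a simplified illustration of the pattern behind `heapless::Vec`, not its actual API):

```rust
// A minimal fixed-capacity vector: all storage is inline, no heap
// allocation needed, so the same pattern works in no_std environments.
struct FixedVec<T, const N: usize> {
    buf: [Option<T>; N],
    len: usize,
}

impl<T, const N: usize> FixedVec<T, N> {
    fn new() -> Self {
        Self { buf: std::array::from_fn(|_| None), len: 0 }
    }

    // Push fails (instead of reallocating) when capacity is exhausted.
    fn push(&mut self, item: T) -> Result<(), T> {
        if self.len == N {
            return Err(item);
        }
        self.buf[self.len] = Some(item);
        self.len += 1;
        Ok(())
    }

    fn get(&self, i: usize) -> Option<&T> {
        self.buf.get(i).and_then(|slot| slot.as_ref())
    }
}

fn main() {
    let mut v: FixedVec<u32, 2> = FixedVec::new();
    assert!(v.push(10).is_ok());
    assert!(v.push(20).is_ok());
    assert_eq!(v.push(30), Err(30)); // full: no silent reallocation
    assert_eq!(v.get(1), Some(&20));
}
```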


Ada is awesome! One of the greatest languages. Should be much more popular.


Sadly it suffered from high prices, and being extra tooling even on UNIX SDKs, until GNAT came to be.

Still, not all languages can assert having 7 commercial vendors in business, lasting 40 years and counting.


In case others were curious to look up the timeline, WP says first validated GNAT release was in 1995, vs 1983 for the earliest validated implementation.

(but perhaps the earlier stuff wasn't practical: "Early Ada compilers struggled to implement the large, complex language, and both compile-time and run-time performance tended to be slow and tools primitive" ... "By the late 1980s and early 1990s, Ada compilers had improved in performance")


Another issue that delayed adoption in regards to GNAT specially was the licence, somehow debatable, similarly to what happened with Qt, or Java implementations.


Pretty much any job using it will require security clearance ...


This is your first comment in 11 months and 1 of only 4 over 10 years. What is great about it? What do you like?


It might be because I have "showdead" enabled but I see that there are many frequent comments from this user when clicking their page. Every comment I see from them seems to be flagged/dead though...


[flagged]


I mean there are few languages as safe as Haskell. Idris maybe even safer. Depending on what "safe" means for you.


I was just making a joke about the logic from the post.


[flagged]


There is a lot of value in constraining what the programmer can do. It's not always about getting to minimum viable product as fast as possible, in some fields you actually need to care about reliability and security. We keep finding security vulnerabilities in "battle tested" foundational software - it would be great if we could stop, or at least severely reduce that some day.

The state of software today is more like having several layers of swiss cheese and hoping that the holes don't happen to match anywhere.


> Programming languages are tools to get shit done.

I, on the other hand, find it baffling that some programmers, despite the fact that programmers generally make computers useful for other people, don't care how computers can be more useful for them.

Those guys in the "square wheels comic"[0] are also getting shit done.

[0] easy to google


> don't care how computers can be more useful for them

My guess is that the OP doesn't see this as a computer being more useful for them, but instead sees it as being less useful as it requires more work to get the same result.

And if you're in a domain where crashes aren't necessarily a problem, or where security isn't a top priority (non-public facing tools often fall into this category), correctness (in the general case) and safety don't automatically map to utility.


> a domain where crashes aren't necessarily a problem

So an application that doesn't deal with user data? Those are rare, but do exist.

> where security isn't a top priority (non-public facing tools often fall into this category)

So nothing accessible remotely, no connecting to a server to influence its behavior, no auto-update, no cross-user collaboration. I sometimes hear "this doesn't need safety, it's just a multiplayer game" and don't know how to answer in a constructive way. Even an application like grep can be exploited if a (hypothetical) unmitigated coding mistake exists and it reads a specially crafted file that somehow ended up innocuously on your filesystem. The threat model is much bigger than you're giving it credit for.


> I roll my eyes every time I hear about yet another programming language for which the biggest invention is that it does not allow null pointers.

> It is even worse when a programming language manages to solve an easily preventable problem by creating additional workload on the programmer.

What programming languages are you referring to here? Both Rust and Ada do more than disallow nulls, and help solve some non-trivial problems. Of course at the expense of additional constraints, but that's probably unavoidable.


I always wondered why you brilliant people who "get Lisp" never actually released anything written in Lisp.


Right, and those C programmers talk big but there's no software written in C, where do they get off?

...I really don't understand this comment, there's so much Lisp code out there being used for everything from airfare search to video games, it seems willfully ignorant to say nothing is released in Lisp. I know, lucky 10,000, but still...


No there really isn't. Naughty Dog used it 30 years ago, but Lisp is not used in gaming


There's Cake Lisp, which was used by its creator to make an iPhone game for his girlfriend one time.


No one pays me to.


Well in theory, honestly, we hacked most of it together with Perl.


Having a more expressive type system does not make a language automatically safer.


In the same way that thicker armour isn't necessarily safer.


No, it is more like having tools that can technically create better armor does not necessarily mean you will on average create better armor than someone with less capable tools.


Ok but when people say "is Ada safer than Rust" they obviously mean when both languages are used by competent people who are familiar with the language. So no, it is not like that.


Even competent people make mistakes and not everyone working on the project will be competent in everything.


That's exactly what "safer" means in this context.

A safer language is one in which the probability of a competent person making a mistake (that makes it into production) is lower.



