What’s to love about C? (mortoray.com)
139 points by nkurz on June 11, 2012 | 151 comments


I am going to make the bold claim that everyone should know C. Even in a world where we can do 90%+ of our coding in a higher-level language, we still aren't at the place where we can get past writing performance loops in C. Case in point: my company recently needed to speed up a 128-bit AND on two MySQL binary fields. While this can be done in pure SQL, implementing the function in C saved an order of magnitude of time. Everyone is going to run into these performance problems, and until we have runtimes that can match hand-coded C, it will in my opinion still be a required skill to have.

(Yes, I know you can s/C/ASM/; however, this is a time issue. Eventually we may get past needing C, but we are not there yet.)
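The core of that kind of operation is tiny in C. Here's a minimal sketch of a 128-bit AND over two 16-byte buffers, the way two BINARY(16) columns might be handed to a native function; the function name and byte layout are assumptions for illustration, not the actual MySQL UDF interface:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch: AND two 128-bit values represented as
   16-byte buffers, byte by byte. A real MySQL UDF would wrap this
   in the server's calling convention. */
static void and128(const uint8_t *a, const uint8_t *b, uint8_t *out)
{
    for (int i = 0; i < 16; i++)
        out[i] = a[i] & b[i];
}
```

The point is that the hot loop is trivial; the order-of-magnitude win comes from not re-interpreting it per row in SQL.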


It's funny, I tend to assume (because it mirrors my own experience, and that of my peers when I entered the industry) that everyone starts off as a C-on-Unix programmer, and specializes from there, into databases, or Windows programming, or whatever. But this isn't actually true, and it makes people go cross-eyed when I say things like "just strace it" or "attach gdb and get me a backtrace". I grew up doing 6502 assembly language. Kids these days do HTML and JavaScript. They don't even know how far away from the machine they are, and when the "magic box" doesn't work, they're stumped.

A good programmer has the mentality "it's all turtles, all the way down", where a turtle means a thing that ultimately is just a piece of code, written by someone much like yourself, that you can understand inside out if only you take the time to do so. Even the CPU is just code (e.g. VHDL).


My programming started on windows trying to learn java (classpath? wth is a directory??), then to dos prompts and rexx scripts, but I didn't really hit my stride until a neighbor helped me install TurboLinux.

I spent the next few months at the console (configure X, what am I a god?), with emacs, and gcc and it was glorious.

I still remember the day when I figured out I could declare a function in one place, and then define it in a different .c file, compile things individually and link them and it would work!
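That declare-here, define-there moment can be sketched in a few lines; the file names and the `add` function are made up for illustration, but the mechanism is the standard one:

```c
#include <assert.h>

/* In a real project these pieces live in separate files, compiled
   independently and then linked:
       cc -c add.c main.c && cc add.o main.o

   add.h -- the declaration any .c file can include: */
int add(int x, int y);

/* main.c -- calls add() knowing only its declaration; the linker
   resolves the symbol at link time: */
static int demo(void) { return add(2, 3); }

/* add.c -- the definition, compiled as its own translation unit: */
int add(int x, int y) { return x + y; }
```

The compiler only needs the declaration to emit the call; matching it up with the definition is entirely the linker's job.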

So yes, I agree; I think everyone should learn C, since it's still the best, thinnest model of the Von Neumann architecture a typical computer provides you. And those early painful steps with C have really served me well when it comes to visualizing the mechanics of a problem.


> I am going to make the bold claim that everyone should know C.

I agree. Anyone who's going to be working at a non-entry-level job in the computer industry should know what the foundations are built on. C still dominates in the kernel space and other low-level facilities everyone uses. If you don't know it, you are at a disadvantage compared to those who do.


> I am going to make the bold claim that everyone should know C.

I agree.

The world of software is built on layers of abstractions. Both the skill of effortlessly layering abstractions and the skill of peeling them away are fundamental to deep programming and modelling skills. Languages like LISP open the door to effortless modelling, whilst a language like C strips the implementation bare.

It's nice and widely available, and fairly neutral on programming style (within the imperative space there are few "political" abstractions, e.g. objects, rammed down your throat that don't actually exist in the CPU).

It can be used in a semi-portable manner, and inter-operates easily with assembler as required.

Boxing, GC, forced classes, and so forth still are not free. They are close to free in many problem domains - but you can't make them go away in other languages when they get in the way.


Pah, 99.9% of programmers will never, ever, ever need to drop down to that level.

Where do these delusions even come from? I think a lot of people here would benefit from 6 months in a 'normal' company, the kind that employs the vast majority of programmers. Just for a bit of perspective.


These delusions come from experience with watching programmers flop around like fish out of water whenever some layer of abstraction they took for granted starts "leaking".

I honestly don't get this attitude in our industry. Nobody tells a general practitioner, "Nah, you don't really need to know what comprises a living cell. You most likely won't even need to drop down to that level."

Just because you don't need to do any low level work, doesn't mean you shouldn't learn anything about it.


This is exactly the sort of nonsense I'm talking about. When, you fool? Do you think most business apps need to do some crazy algos or something that would benefit from C?

If you think most programmers who know Java, C#, Ruby, or Python have ever dropped down to C, you're deluding yourself. They never need to; not only that, the majority of their performance problems come from poorly written SQL queries, and the rest come from poorly conceived loops.

As I said, go get a job in a normal company; open your eyes. In most job applications, putting the ability to drop down to C on your CV will not give you an advantage. Because they will never, ever, ever need it.


"Where do these delusions even come from?"

Probably from those of us who have found a working knowledge of C fairly useful in their careers?

I've been working for 24 years in software and I'd say that during most years I've written or read C sources. Some years it would be a lot, some years relatively little, but it's been a very useful skill for me.

I agree that relatively few people need to write C these days - but knowledge of it does, in my opinion, help quite a lot.


> I think a lot of people here would benefit from 6 months in a 'normal' company

I've worked several years at a normal company. My group of dozens of software engineers codes overwhelmingly in C, with Python, Perl, Lua, and the occasional ML in support roles.

I think a lot of people here would benefit from 6 months in a 'normal' company, the kind that employs the vast majority of programmers. There's so much more to the universe than Rails, Django, and JQuery...


> Pah, 99.9% of programmers will never, ever, ever need to drop down to that level.

That is true. Most programmers will not need to write code in C. But they'll still be better and more versatile programmers if they know the language.


Pretty much any extra skill will make you a better programmer if you disregard the time put into learning it, so that's hardly a high bar. What about comparing the marginal return of an hour spent learning C with an hour learning best practices in their own particular field, whatever that might be?


> Most programmers will not need to write code in C. But they'll still be better and more versatile programmers if they know the language.

I agree, although I'm not sure C should be the top priority for someone that will never need to actually use it. There are lots of other things to learn and nowhere near enough time to learn them all.

Although learning C when you already know other languages is not particularly difficult.


When I needed to fix a bug in the mono vm, guess what language I was using. Or apache. Or SQLite. You'll notice I'm not talking about the language my app was programmed in. I won't even start on the abomination that is the typical ruby gem ext directory.

If you had said 99.9% of programmers should not write C, I'd agree. But the reality is there's a lot of numb nuts out there, and you're going to need to fix their code. If you can't beat em, join em.


Completely this. On a similar note, someone I follow on twitter posted that they thought it was important that every programmer know the response times of the L1, L2, and L3 caches...


I'm part of the 0.1% :)


I think programmers should know the basics of C even though most of them will never work at that low a level. If you don't understand C, you probably don't understand any modern imperative languages, practices, or patterns. If you don't understand imperative programming (what it is, where it is good, where it is bad) then you don't understand high-level or functional programming.

Also, C isn't a complicated language. Like any, it has faults and warts, but it's pretty damn elegant on the whole.

C++, on the other hand... don't get me started.


If you ever have to deal with a program that crashes (e.g. segmentation fault, etc) then the guy with the knowledge of C and the tools (gdb etc) will run rings around the guy who only knows a scripting language.


That's not a bold claim, that's a very safe claim. Learning C won't hurt anyone, except for the missed opportunities, which are invisible. But if you take all the "everybody should learn X" claims literally, you'll spend all your time learning.

So here's my counter-claim. Most Java and .NET developers will do just fine without knowledge of C.


Until you need to do something novel or unique. How useful would .NET be without P/Invoke? How many Java native extensions exist out there?

Above and beyond the practicality of being able to extend your language or debug it when it breaks in new and interesting ways, you also gain that much more general knowledge of how computers work. This isn't to say you aren't any good without it, but if you want to be that much better, you should learn C.


Server-side and in command-line tools there is a ton of software that need not explicitly call into the lower-level stuff. The code you call will eventually do that, but as far as the programmer is concerned, that is of no importance.

GUIs may still be a different matter, but I haven't programmed them in .NET/Java in years.


What's "just fine"? Paying the bills?


That claim doesn't really tell me anything without a measure. How much C should everyone know? I passed an introductory course in C, but that far from qualifies me to write any production-ready C. Every day I see some thread on C pitfalls here, many of which I would probably fall into once. What do you think someone should be able to do in C?


Generally: knowing the syntax, how memory allocation and pointers work, the basics of calling functions, and how to read the output of a backtrace.

Furthermore, not C directly, but they should also know how programs link and execute.
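Those basics fit in a few lines. A minimal sketch of the allocation-and-pointers part (the `duplicate` helper is made up for illustration; the standard library's `strdup` does the same job):

```c
#include <stdlib.h>
#include <string.h>
#include <assert.h>

/* Heap allocation, pointer use, and the ownership discipline in one
   small function: allocate, check, copy, and hand the buffer to the
   caller, who must free() it. */
static char *duplicate(const char *s)
{
    size_t n = strlen(s) + 1;    /* +1 for the terminating NUL */
    char *copy = malloc(n);
    if (copy == NULL)
        return NULL;             /* always check malloc's result */
    memcpy(copy, s, n);
    return copy;                 /* caller owns this memory now */
}
```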


Yes, and one of the reasons that this is a good idea is that the opportunity cost is so low. C is a pretty simple language. Mastering it is hard, but the number of rules you have to learn just isn't that big.


So maybe Bloomberg should learn C?


inb4: "Please don't learn anything, especially C".


After using C++ for years (in addition to Python, Haskell, etc) I've recently converted back to C for my low level tasks. You just gotta love the simplicity of C programming. My jobs consists of reading blobs of binary and throwing it around different buffers in memory (mmap's and dma buffers, etc), so C is a natural choice.

I like how C is fast and dangerous; there are no silly security layers (like Java's) and it lets the operating system do its job. Fast stack allocs for temporary data are great but deadly if you fuck up.

Coding in C is like carrying a loaded gun pointed at your feet. You have to handle it with care and responsibility but when you need it, you can shoot away any extra limbs that get in the way and show 'em who's the boss.


> Coding in C is like carrying a loaded gun pointed at your feet.

Could not agree more. Except I would say pointed at someone's head, or, perhaps, your own.


Coding in C is like being presented with a grid of buttons of various shades of blue and green. You know to press a specific colour, but it doesn't always do quite what you expect, and half the buttons segfault.


People who love C love it, in part, because they are often solving interesting problems with it. Many problems that require C are interesting: embedded software, automated robots, new databases etc. Most problems that C is not good at - Web development - are not that interesting to many people that love C.


C is not good at web development? Most of the "web" stack is written in C: OS/NGINX/apache/php/java/perl. If you do write the top of the stack in C, it will run faster than anything else and be smaller. This means lower turnaround time which is VERY IMPORTANT in web apps. C, in many cases, is the best Web language.


In what world does runtime speed and compiled size have anything to do with turnaround (or programmer productivity, as I read it)?

C is a great language for many reasons, but let's not conflate benefits.


Perhaps there's a misunderstanding in the term "turnaround". I meant it in the sense t = x + y + z, where t = total time, x = send time, y = processing time, and z = send-back-to-user time. If you can decrease y, you decrease t. Responsiveness is important to users. With the advent of large and extremely fast memory caches, small program size will dramatically increase execution speed by having small code and small data already in cache. A smaller executable also enables many more instances on a single physical node. Size matters. Speed matters.


I wrote a CGI in C around 2005. The C code produced only a small amount of XML. All the presentation layers were done using client-side XSLT, CSS, and a bit of javascript. The result was very portable and efficient: we had to support serving from z/OS mainframes that used EBCDIC, AIX, HP-UX, Solaris, Linux, and Windows.


None of those require C, and are frequently done using high level languages.


Your comment may have gone over better if you included specific examples. For one, here's a story about Lisping at the Jet Propulsion Lab: http://www.flownet.com/gat/jpl-lisp.html


The thing I love about C is that it is the most obedient of all languages and never assumes you are an idiot. Never does it tell me I'm about to blow my own foot off; it just does it, and I've learned from the experience. Because of this, it taught me to think about things first and to know all the consequences of everything I do.

*NIX is the same.


Doing exactly what you told it is fine until you hit one of the many undefined behaviours and then it does exactly what you thought you told it only some of the time.

There are a lot of undefined behaviours to know about, and not all of them are obvious. You can be merrily compiling your programs fine on GCC and Clang, with your intended behaviour seeming entirely obvious, only to have it blow up on another compiler or with different compiler flags.

That said, the undefined behaviours do allow the compilers to do some very tight optimisations, so swings and roundabouts.
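A classic example of that trade-off is signed overflow. The check below looks obviously correct, but because signed overflow is undefined, a compiler may assume `x + 1 > x` always holds and optimize the test away entirely; the safe version avoids overflowing at all (function names here are illustrative):

```c
#include <assert.h>
#include <limits.h>

/* Looks reasonable, but is undefined when x == INT_MAX: the
   optimizer is allowed to turn this into `return 0;`. */
static int will_overflow(int x)
{
    return x + 1 < x;
}

/* Well-defined: test *before* overflowing. */
static int will_overflow_safe(int x)
{
    return x == INT_MAX;
}
```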


That tends to matter less if you have a decent test harness and plenty of assertions!


Assertions don't reach everywhere C's undefinedness lies; e.g. shifting by sizeof(int) * CHAR_BIT or more bits, in the case where the shift amount depends on your input.

"Decent test harness" is like "sufficiently smart compiler". Everyone assumes it will be there when planning, but in practice it is only available in very specific and not-generally-useful cases.


Here's a one-file, ISC-licensed test framework I wrote for ANSI C: "greatest" (https://github.com/silentbicycle/greatest). It doesn't depend on any dynamic allocation, or anything beyond C89, and compiles with zero warnings under -Wall -pedantic.


Hadn't thought about that. I haven't done much C since I discovered proper automated tests.


The following program compiles without warnings using gcc 4.7.0. The compiler uses the undefined behavior of accessing a variable that may not have been initialized to turn the conditional into one that's always taken, and then folds it away.

https://gist.github.com/2910012
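For readers who don't follow the link, here is a hedged reconstruction of the kind of program being described (the gist's actual code isn't reproduced here): `lol` is assigned on only one path, so the second test reads it uninitialized whenever `bar != 5`, and the optimizer is free to fold the branch however it likes.

```c
#include <assert.h>

/* Reads `lol` uninitialized when bar != 5: undefined behavior that
   gcc may use to treat the condition as always (or never) true. */
int foo(int bar)
{
    int lol;                 /* no initializer */
    if (bar == 5)
        lol = 7;
    return lol == 7;         /* undefined unless bar == 5 */
}
```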


valgrind:

   ==46592== Conditional jump or move depends on uninitialised value(s)
   ==46592==    at 0x100000E90: foo (in ./a.out)
   ==46592==    by 0x100000ED3: main (in ./a.out)
The thing is, valgrind exists. Anything that would be a C deficiency but that valgrind can decently overcome is, practically speaking, no longer a C deficiency.


Not true. Valgrind performs dynamic analysis, while these kinds of errors could be caught by static analysis.


+1 I run valgrind on all my test cases.


clang does catch this:

  clang -Weverything uninit.c

  uninit.c:3:6: warning: no previous prototype for function 'foo' [-Wmissing-prototypes]
  void foo(int bar) {
     ^
  uninit.c:10:6: warning: variable 'lol' may be uninitialized when used here [-Wconditional-uninitialized]
        if (lol == 7) {
            ^~~
  uninit.c:4:9: note: initialize the variable 'lol' to silence this warning
        int lol;
               ^
                = 0
  uninit.c:37:3: warning: no newline at end of file [-pedantic,-Wnewline-eof]
  */
  ^

  3 warnings generated.


Glad to hear it--I've been looking into using clang more... This might just tip me towards it.


Interesting. Our compiler catches it, though:

  $ cparser -O3 -Wall -Wextra -pedantic test2.c -o uninit
  warning: ignoring gcc option '-pedantic'
  test2.c:3:6: warning: no previous declaration for 'void foo(int)' [-Wmissing-declarations]
  1 warning(s)
  test2.c:4:6: warning: 'variable lol' might be used uninitialized [-Wuninitialized]
Looking at the GCC bugtracker, there seem to be various warnings missing. http://gcc.gnu.org/bugzilla/buglist.cgi?quicksearch=uninitia...


There is something wonderful about low level programming langauges. I enjoy writing code and knowing exactly what the ASM is going to look like.

Python is fun, but I've got no idea how it is doing what it does.


This is very true but in my experience it's not always a good thing. It often leads me to premature optimization for instance.


I've been working on some ideas about how different languages lead to different sorts of premature optimization problems. I see a lot of OOP problems where developers try to avoid re-initializing or re-instantiating an object, and end up with weird partial-initialization or wrong initialization edge cases. I think language structure encourages this, and have been thinking about what other sorts of premature-optimization problems languages encourage.


This is exactly why I'm learning the various ASM right now. I am not content (depending on what I'm trying to do, of course) with a program just doing "something" that works.


How often do you need to know what it's doing?


Usually when something doesn't go as quick as you were hoping, profiling is unhelpful and you have to take the python bytecode to bits to find out what the hell it's doing inside.

That happens quite a bit these days for me.


Oh, and this. For sure. Why is everything so slow?!


> Why is everything so slow?!

Urf tell me about it. Does my flipping head in.


I used to play around with 3D graphics in real-time on the 486. I needed to know exactly what was going on to even get 30 fps.

I managed to get one of Michael Abrash's (id Software) loops to run 0.5 clock cycles faster, which I was pretty proud of.


When you are attempting to do a very specific thing. Not going to go into what that might be, unless you're curious, but sometimes you do need to know exactly or as close to exactly how your code is operating.


  * Lots of programmers you can hire today (contrast to D, ML, Haskell )
  * Excellent introductory texts (Kernighan & Ritchie)
  * Backed by giants like Microsoft, IBM
  * Everything else always has some mechanism to link against it
  * Thorough API documentation (if quite large) MSDN, man 3 sprintf


I wouldn't quite agree that Microsoft are a good example of "backers" of C. Unless, of course, you consider backing for C++ to be enough. Microsoft's compiler doesn't support C99, and likely won't do so any time soon (http://connect.microsoft.com/VisualStudio/feedback/details/5...).


I don't mind the way MS does C.

Whacking variable declarations at the top of the scope becomes second nature quickly.

I came from assembly language, where the process is very similar and not too demanding.


Microsoft not supporting C99 is a huge issue, because open-source C projects that are supported on both Unix flavors and Windows cannot integrate C99 features into their core: Windows support means building with MSVC. It might not be a big deal for one individual project, but you're literally talking about the entire world of open-source C projects being restricted by their decision.


In this case (open source projects using C99) they can use mingw.


An ironclad rule of platform-specific development: You use the compiler that everyone else is using. On Windows, that means Visual Studio.


I'm really happy for platform-specific development, and I'm gonna let it finish, but cross-platform development is the most cross-platform development of all time.. of all time!


I think it's a lot harder to find good C coders than you might think. For example, 80% of the people who claim to know C that interview with my company fail a test on reading a CSV file.


But many of those 80% probably just haven't needed to use C, or read a CSV file, in a while. If someone randomly asked me to do that in Lisp, when I've been coding in Python for a year, it would take me a while to remember how to do it. But after working with Lisp for a week, I could do it easily again.


You don't even need a week if you have access to the internet, which as a dev you will. (Or with just Clojure's repl-docs you could probably come up with `(with-open [rdr (reader file-name)] (doseq [line (line-seq rdr)] (let [parsed (.split line ",")] ...)))` fairly quickly, and even if you forgot the specific reader syntax remembering the more memory-intensive `slurp` for a year seems doable and ought to be acceptable in an interview.) If you hadn't been doing Python for a year would you still remember the existence of the CSV module you can import, and then use dir() and help() on if you forgot the exact syntax?

The particular example of reading a csv file is just a bad one to compare C and any dynamic language with, because in C file IO is a pain in the neck with a lot of pitfalls whereas dynamic languages make it lovely and you only have to remember a small amount. I guess it's a decent test to see if one's potential C coder has memorized all the details about C's file IO, maybe the company does a lot of that since they create libraries for others or something. (Did you allocate enough memory on the stack/heap? Are you reading byte-by-byte looking for a newline or in chunks before looking? Are you checking if you need to re-alloc? Do you support "\n", "\r", and "\r\n"? Are you checking for E_NO_MEMORY and E_BAD_SOURCE? Are you using sprintf safely? Are you checking for end-of-file correctly? Are you tokenizing the line properly? Are you faster than Python? Etc.) If the interview is just looking for coders who can "fopen, fread with malloc, and fclose, it's okay if details are forgotten", then it's just a weeder question rather than a skill test and there are better weeder questions.

Personally I have a handful of C programs on hand that I know are correct and that do various IO stuff, I tend to just copy from those on the rare occasion I need to use C to do manual file IO or memmapping (we all remember the subtleties of that right?) because of the reasonably high probability I'll forget one of the many details if I do it from scratch. I much prefer using already existing libraries to read and parse for me; there's too much reinventing-the-dysfunctional-wheel culture in C.
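To make the interview task concrete, here is roughly the naive version most candidates would be expected to produce: fgets into a fixed buffer, split on commas, print the fields. It deliberately ignores quoted fields and overlong lines, which is exactly the kind of detail the comment above lists; the function name and the field-count return value are assumptions for illustration:

```c
#include <stdio.h>
#include <string.h>
#include <assert.h>

/* Read a CSV stream line by line and print each field in brackets.
   Returns the total number of fields seen. Naive: no quoted-field
   handling, lines longer than the buffer are split. */
static int print_csv(FILE *fp)
{
    char line[1024];
    int fields = 0;
    while (fgets(line, sizeof line, fp) != NULL) {
        line[strcspn(line, "\r\n")] = '\0';   /* handle \n and \r\n */
        for (char *tok = strtok(line, ","); tok != NULL;
             tok = strtok(NULL, ",")) {
            printf("[%s] ", tok);
            fields++;
        }
        putchar('\n');
    }
    return fields;
}
```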


I used Lisp and Python as examples because I don't know C myself. In any case, the point was that when people say they know a technology, they might mean that they can answer any question you can come up with, or they might mean that they once knew it, and can become proficient with it again in a short time. Of course, some people don't actually know anything about it, but I doubt this number is quite as high as 80%.


I can believe the 80% number. I don't know any formal studies (surely there have been some by now?) but I've heard and read a lot of anecdotes (like the one in this thread) from technical interviewers about how a high percentage of their applicants can't write code or are in some other way incompetent for the job; I have yet to hear from one about how almost all their applicants are awesome and they wished they could hire them all. If you haven't seen the accounts about the FizzBuzz problem alone (and the hilarious failures of people online with no stress or time constraints pasting their incorrect solutions in blog comments to prove their skillz) it's worth half an hour of your time to read about it.


What if I promised you gainful employment and gave you as much time as you wanted to prepare? Yeah, most people think "I know C, it's just like C++/Java." A good 75% of the people who fail don't even invoke malloc once.


It's one thing if you tell them beforehand "C, C, it's all about C here!" But every interview I've had has been pretty flexible in letting me choose what language to use.


To be fair, CSV is a relatively complex format with all kinds of edge cases and gotchas. In a test like situation, I'm sure 80% of the people would probably fail the test if done in any language.


Right which is why they are just asked to read in a given file and print out the contents slightly formatted, not make a full CSV parser.

Most people can't even open a file and read in the contents; forget parsing, just reading the contents in is where most people fail.


I would have to guess that they would fail in pretty much any language unless you provided a clean file. Parsing text is never, ever simple.


ha, try asking them to write a program to list the contents of a directory. that'll give you an insight into their coding background...
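Part of what that question reveals is that there is no portable ISO C way to do it; you reach for the platform API. A minimal POSIX sketch (the `list_dir` name and the entry-count return value are choices made here for illustration):

```c
#include <dirent.h>
#include <stdio.h>
#include <assert.h>

/* List a directory's entries with POSIX opendir/readdir.
   Returns the number of entries printed, or -1 on error. */
static int list_dir(const char *path)
{
    DIR *d = opendir(path);
    if (d == NULL)
        return -1;
    int n = 0;
    for (struct dirent *e; (e = readdir(d)) != NULL; ) {
        printf("%s\n", e->d_name);   /* includes "." and ".." */
        n++;
    }
    closedir(d);
    return n;
}
```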


Not sure I'd still qualify K&R as an excellent introduction to C in this day and age; the book lacks focus on safety, which is so important today. It's fine for learning the language, but a language is more than its features.


Ugh. Just today I've had to clean up a problem where a function was called without enough parameters, but because the compile-and-link step included just the right components, there was neither a warning nor an error. Not only that, but because of a coincidence on the stack (the missing parameter was a NULL, which just so happened to be lying around), the relevant unit tests actually passed.

The problem with C is that, for all its simplicity, there's a ton of voodoo which you've got to "just know", and the best practices are not at all clear from what the language makes easy. In fact, the language makes it easy to make mistakes and difficult to find them.
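The bug described above is exactly what prototypes exist to prevent: with no prototype in scope, old-style C accepts a call with too few arguments and the callee reads stack garbage. A minimal sketch of the fix, with all names hypothetical: put one prototype in a header that both the caller's and the callee's files include, so the wrong-arity call becomes a compile-time error.

```c
#include <assert.h>
#include <stddef.h>

/* reply.h -- the shared prototype; with this in scope, a call like
   send_reply(&c) with a missing argument fails to compile instead of
   passing whatever happens to be on the stack. */
int send_reply(int *conn, int flags);

/* reply.c -- the definition, matching the prototype exactly. */
int send_reply(int *conn, int flags)
{
    if (conn == NULL)
        return -1;
    *conn += flags;
    return 0;
}
```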

I wish Go had made GC optional, I really do.


Guaranteed simple flow

And then there's setjmp/longjmp.

Clear and unambiguous

Aside from all the undefined and implementation-defined behavior.

A macro preprocessor

Which just does plain, dumb text substitution with no understanding of program syntax (or the environment where the expanded macro will appear!). CPP's macro system is a pretty leaky abstraction.
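The leakiness fits in one example: because the substitution is textual, an unparenthesized macro splices into the surrounding expression instead of behaving like a function call (macro names here are illustrative).

```c
#include <assert.h>

/* Plain text substitution in action: */
#define SQUARE_BAD(x)  x * x            /* expands textually */
#define SQUARE(x)      ((x) * (x))      /* parenthesized, as it must be */

/* SQUARE_BAD(1 + 2) expands to 1 + 2 * 1 + 2, which is 5, not 9. */
```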

The other points are why I use C (when I use C), but I wouldn't make these three arguments in favor of C.


There's an old adage that a computer only does what you tell it to do. What I love about C? It really does only do what you ask it to; nothing more, nothing less.


When C does what you ask it to do, you're probably just getting lucky. It's more or less impossible to write a nontrivial C program without relying on undefined behavior.


With the exception of the translation limits (a strict reading of which may allow a compiler that can only compile a single program), it's pretty much trivial to avoid undefined behavior.

Implementation-defined behavior can be harder to avoid, but that's not a problem if you're targeting a known implementation.


Enable all warnings and fix them!


> It really does only do what you ask it to; nothing more, nothing less.

Not even assembly language has this property anymore, especially in the presence of an OS.

Which leads me to what I think is C's greatest weakness: it's no longer a thin layer over assembly language. It is so far removed from current trends in hardware design that it doesn't even have functions that implement memory barriers as part of the core language, let alone any language features to help you take advantage of whatever SIMD hardware the machine possesses.

Part of this is C becoming a victim of its own success: People expect the same language to compile down to ARM and x86-64 and IBM mainframe (zSeries) and so on and so forth machine code, and do it in an efficient manner. Since low-end ARM chips don't have SIMD, C can't have SIMD, the thinking goes (even though low-end hardware doesn't have FP opcodes, either, but floating-point math is still part of the C standard).

They rely on compilers to make it fast, which is reasonable to some extent but it's also the same thing you rely on when you write in Haskell. It's contrary to the spirit of things.

I'm not advocating a return to assembly language. That would be massively stupid. However, it would be nice to have a language with concise syntax that also allows you access to the processor's status word (the carry flag, for instance).


C has memory barriers now in C11.
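Concretely, C11's stdatomic.h provides ordered atomics. A minimal sketch of a release/acquire pair (the `publish`/`consume` names are invented for illustration): the release store guarantees that any thread which observes `ready` as true also sees the write to `data`.

```c
#include <assert.h>
#include <stdatomic.h>
#include <stdbool.h>

static int data;
static atomic_bool ready;

/* Writer: the release store publishes `data` along with the flag. */
static void publish(int value)
{
    data = value;
    atomic_store_explicit(&ready, true, memory_order_release);
}

/* Reader: the acquire load pairs with the release store above.
   Returns -1 if nothing has been published yet. */
static int consume(void)
{
    if (!atomic_load_explicit(&ready, memory_order_acquire))
        return -1;
    return data;
}
```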


Right, but you would just link against the appropriate library, BLAS or whatever, that is hand-tuned for your platform and exposes a C API.


For the kind of problem where I can take that approach, I also have a python API onto BLAS, and guess which I prefer to write my programs in.


There's nothing in the "Python way" that precludes dropping into C as and when it makes sense to do so. A C programmer is able to make that decision independently.


I was just typing up a reply saying just that in more words when I saw yours; so for the sake of efficiency I'll just make my post a pointer to yours ^^


I agree with all of these comments. But... Unless you are an old-school coder who grew up with C, it is just so much easier to get stuff done with Python/Ruby or, hell, Bash/Awk/Perl. Granted, that last was for very specific tasks, but still. Want to write security code? C is is your friend, but you better be good at it, very good.


> Unless you are an old-school coder who grew up with C, it is just so much easier to get stuff done with Python/Ruby or, hell, Bash/Awk/Perl.

True, but who is going to write your Python/Ruby interpreters then? And using what language?


> True, but who is going to write your Python/Ruby interpreters then?

Pythonistas will.

> And using what language?

Using python. Case in point, Pypy.

<\ :P>


> Using python. Case in point, Pypy.

Yes, PyPy is an excellent example of doing low level tasks in a high level language. It's also quite rare. I think in the future we'll see more examples of self-hosting dynamic languages that have a JIT-compiler based execution environment that combines interpretation and compilation.

However, in comparison to the stuff that the PyPy guys are writing, C programming is easy. And I bet all of the PyPy programmers are excellent C programmers, because they have to be good with low level stuff anyway.

I still advise learning C programming, and how interpreters and compilers work.


Fair point, sir. But I don't know that those are always optimal solutions.


Oh, you are certainly on point. But how deep down the rabbit hole do you want to go? I mean, after reading "Trusting Trust", you really can't be sure of anything. Who writes your compilers? Assuming you are not specifically referring to security stuff, and more talking about a responsible community; well, I don't know. I do know that I need to get better at C. And that that was w/r/t interpreter comment, in case that implication was not obvious.


This is why (good) computer science studies generally involve everything from CPU fundamentals to operating system programming and compiler design. I'm not afraid to look inside my interpreter, operating system kernel or my C compiler.


And, look, I'm relatively new to the game, so please forgive my ignorance when it appears - although I definitely get what you are saying with this comment.


For web and system administration tasks, sure. For anything where performance is an issue, you simply won't be using an interpreted language.

C is by no means my go-to language for personal projects, but there's still a helluva lot of software out there that it would be folly to write in anything but.


One great advantage of C (which was kind of mentioned in the article) is predictability of the generated code. Compared to a C++ compiler, a C compiler is much simpler. When you write a for loop, even doing something on a struct or some fairly "complicated" object, you have a good idea of the machine code generated. It's also easier to go between the assembly code and the C code generally. In some applications like email or image processing, this can be valuable.


> predictability of the generated code

To some extent, I hope the code that gets generated isn't predictable in the way you mean: I want the compiler to know all of the obscure ways to make string operations go faster by using SIMD hardware, for example. I want that kind of deep low-level knowledge to be encoded into the optimization passes because I don't know it and I likely never will. Isn't that why we have optimization passes to begin with?


What I don't like about C: it's 2012, strings should be easy. Instead they are a source of bugs and security vulnerabilities.


One thing I LOVE about C is that it's so easy to create APIs that other languages can use. For instance, if a C library exposes a good API, you can call it from lisp quite easily with CFFI. This gives you a large advantage: the low level, raw speed of a C library paired with the high-level forms of whatever language you choose.

I'm all for people writing programs/apps in C++, but I kind of cringe when I see a library in C++ because it's a harder language to wrap around.


C is what you use when you should actually be using assembler, but you are not insane enough to actually do that.


The statement "C is portable assembly" may have been true some time in the 1970's but compilers have improved since then and the statement is no longer valid.


[deleted]


> Believe me, there are a lot of cases where a human beats the compiler.

Oh, I do believe you. I do analyze and re-write assembly code from the C compiler every now and then. SIMD optimizations are especially tricky. But the compiler still wins in at least 80% of cases.

And when you add a single line of assembly code (inline or not) to your C code, you inhibit all optimizations that could take place.

So for your innermost loop it might make sense to hand write assembly, but for most optimizations just re-think your C code and/or add compiler intrinsics to get SIMD or other CPU facilities put to better use.


> when you add a single line of assembly code (inline or not) to your C code, you inhibit all optimizations that could take place.

That's a significant exaggeration with most compilers. You may inhibit some optimization which would require the compiler be able to reason about side-effects across the asm block, but that's a far cry from "all optimizations".


> compilers have improved

And hardware has changed. Remember, the C machine model (the virtual system you imagine in your head when you write C) doesn't have cache, doesn't have SIMD, and doesn't have any kind of ability to do things in parallel.


Really? Optimisers are pretty damn good, so C has pretty much replaced assembler.


I think the most dangerous part of C is that many people assume C is perfectly efficient. It's as if C was the "100%" of performance and other languages are converging towards it as they get better.

But I don't see how this is true. Pointer aliasing is the biggest example probably:

http://www.futurechips.org/tips-for-power-coders/how-to-tric...

And then there is the huge abstraction penalty for things like `qsort()` where C++ is actually way ahead of C.


The preprocessor, and conditional compilation especially, are not something to love about C. That is probably one of C's most hateful, code-goop enabling capabilities.


C is well placed in that sweet spot close to the hardware where the abstractions are good enough to hide the hardware specifics from the programmer and good enough to build more advanced abstractions with sufficient efficiency.

Because it is so close to the lower levels of abstractions, it ended up a compact and concise language, easy to keep in mind. This (among other things) led to C being the quintessential language for some segments of "computer science": maybe not so much directly but as a reference model.


In embedded product development, C and C++ are king.


No. ...wait


I've never heard anyone informed bash C. C++ and Java, yes, but not C. Even people who've never used it tend to know that C has a domain at which (a) it excels, and (b) it's really the only credible option.

There's actually no such thing as a bad language. I enjoy language wars as much as anyone else, but languages are only good or bad relative to the type of problem they're being used for.

For example, what makes C++ and Java odious is that people try to use them for inappropriate purposes. They aren't platonically "evil" languages, and they weren't designed by stupid people; they're just inappropriate for over 90 percent of what modern programmers have to do in their professional lives. When people need high-level features and don't have them, they tend to roll their own-- badly. This is Greenspun's Tenth Rule and the heart of those god-awful "design patterns".

C++ and Java, for all that, also have domains in which they're the appropriate languages to use. They're just very small.


Here's your informed person bashing C: http://warp.povusers.org/grrr/HateC.html (Disclaimer: I like C more than C++, though I can't say I love either of them.)

I do think there are bad languages. Brainfuck is terrible. Maybe there are no bad serious languages, but you'll have to define what you mean by serious, and you'll have to argue with Dijkstra's ghost when it comes to COBOL et al. (though I think he would have liked J). (His two famous quotes against COBOL come from http://www.cs.virginia.edu/~evans/cs655/readings/ewd498.html) I don't think good/bad is really the best way to judge languages, I'm more interested in language power, fluidity, and library support.

I'd agree that every language has a purpose, even if the purpose is as simple as making a joke like in the case of LOLCODE, and that people abuse the language beyond its intended purpose. I read somewhere someone asserting "Design patterns are missing language features" and I more-or-less agree with that.


  I do think there are bad languages. Brainfuck is terrible.
Brainfuck is not terrible, it's just not easy on human eyes. Brainfuck is really a very simple 8-instruction Turing machine designed to have the smallest possible compiler on the Amiga (240 bytes, at that). It is actually a derivation of P'', a programming language from the 60s used to describe a family of Turing machines.

Now, for a truly terrible language, I introduce you to the only programming language named after the eighth circle of hell in Dante's Inferno...

  @@@@@@@@@@    @@@@@@   @@@       @@@@@@@    @@@@@@   @@@        @@@@@@@@  @@@@@@@@  
  @@@@@@@@@@@  @@@@@@@@  @@@       @@@@@@@@  @@@@@@@@  @@@       @@@@@@@@@  @@@@@@@@  
  @@! @@! @@!  @@!  @@@  @@!       @@!  @@@  @@!  @@@  @@!       !@@        @@!       
  !@! !@! !@!  !@!  @!@  !@!       !@   @!@  !@!  @!@  !@!       !@!        !@!       
  @!! !!@ @!@  @!@!@!@!  @!!       @!@!@!@   @!@  !@!  @!!       !@! @!@!@  @!!!:!    
  !@!   ! !@!  !!!@!!!!  !!!       !!!@!!!!  !@!  !!!  !!!       !!! !!@!!  !!!!!:    
  !!:     !!:  !!:  !!!  !!:       !!:  !!!  !!:  !!!  !!:       :!!   !!:  !!:       
  :!:     :!:  :!:  !:!   :!:      :!:  !:!  :!:  !:!   :!:      :!:   !::  :!:       
  :::     ::   ::   :::   :: ::::   :: ::::  ::::: ::   :: ::::   ::: ::::   :: ::::  
   :      :     :   : :  : :: : :  :: : ::    : :  :   : :: : :   :: :: :   : :: ::   
                                                                                    
EsoLangs wiki entry: http://esolangs.org/wiki/Malbolge 99 Bottles: http://99-bottles-of-beer.net/language-malbolge-995.html

Writing programs in Malbolge is so difficult that typically one has to use another language to generate candidates and search for a working Malbolge program, or make extremely large programs that go on for pages.

You have been warned!


> Maybe there are no bad serious languages...

How well do you know PHP?


Why would he have liked J? I do not see how he would describe it differently from "a mistake, carried through past perfection". Also, slightly related: he may not have seen J, but http://www.dijkstrascry.com/node/90 shows that he cannot have been totally unfamiliar with APL developments.


I hadn't seen that note from him, thanks for the link. I think J solves a lot of the problems he makes of APL. It can be taught without a J interpreter, it doesn't require a special keyboard, and it offers abstraction capabilities. Also it's an answer to some positive things he talks about in this 1985 interview and elsewhere: http://www.cs.utexas.edu/users/EWD/misc/vanVlissingenIntervi... It supports both functional and imperative styles and it's deeply rooted in mathematics. It's also "hard" and isn't made for the layman developer. It encourages a very logical thought process. I would guess he'd think Haskell was superior, but I still think he'd at least hold J in higher regard than APL even if he didn't like it.


I really flipping "hate" C. It's used for everything in Linux it seems like. Things that really ought to be high level turn out to be a libfoo.so, requiring a C compiler.

It's not a GUI programming language. It's not a text-handling programming language. It's not a symbolic programming language.

But it's used as those, constantly.

Pointer-based errors are endemic to C; they ought to be an expected part of the language usage by now. There are reasons why static analysis tools such as Klocwork are out there, and are very expensive... and keep being bought. Because C is bad for an incredibly large range of tasks that it keeps getting used for. I've done some of that, and using C (or C++) made things harder and more error-prone.

It's also very good at other tasks... like writing OS kernels or drivers. I've done some of that, and C made it possible.

Although at this point, I'd be interested to give writing an OS in D a spin to see how it works.


Do you hate C? Or do you hate the way it is overused in Linux?


Mostly #2.


> I really flipping "hate" C. It's used for everything in Linux it seems like. Things that really ought to be high level turn out to be a libfoo.so, requiring a C compiler.

I'm confused by what you mean. What is this "higher level" that functionality should exist at that is above libfoo.so?


A better phrasing would be: "things that ought to be written in a highlevel language frequently turn out to be written in C".


Most of us who dislike C tend to keep quiet about it, because we know we'll never hear the end of it.


I think the difference is, most people recognize that there is a place where C is really your only choice. Maybe you need a little shim code to interact with a shared object in another language from ML or something. It's harder to argue that for C++ or Java.


> most people recognize that there is a place where C is really your only choice

But should it be?


Probably not. On the other hand, if you want to start that project today, C is what you have.


In some projects, yes. For example, see the Linux kernel.


I don't know, but at the very least it is, which means you can argue it has utility.


Other than when compilers are simply not available, a rare occurrence these days (and I will argue "missing the point" if that's the only important property of the language), I am having a difficult time imagining a situation where C works but C++ would not.


However, C is not suitable for all purposes and it would be nice if people bashed it a little bit more. For example, it is very common as an introductory CS101 first programming language here in Brazil even though it is full of traps for beginners and has some severe limits in expressive power (most notably, manual memory allocation really gets in the way)


My son's CS101 class at Penn State was based on C++ and supposedly taught OOD/OOP. But the instructor only focused on syntax and there were vast portions of the whole class system missing from the course.

I think it's probably a mistake to teach OOD/OOP in an introductory course and would rather see types, assignment, flow-control, etc. covered in a thorough manner. As it was, he ended up with big gaps in his general knowledge.


Our university used Java and it was exactly as you say. I think Java dumbs down students even more because it is a single-paradigm language.


If I was teaching a programming class, I'd probably pick Coffeescript or Python as the implementation language (Nothing against Ruby ... I'm just not expert enough to be teaching it).

My rationale is that I can easily start just by letting the students type into a REPL while they're learning basic concepts like assignment and operators. Classes are there when you need them but can be ignored until the students are comfortable with both the language syntax and the basic concepts behind programming.

I hate classes where the instructor says "just do this without asking why for now" ... and I learn best when I can experiment freely and iterate quickly.


I think another problem with teaching Java in university is that pretty much everything is defined in detail. I have seen students that were taught Java write OCaml code that expected a particular order of evaluation of expressions (which is unspecified, btw).


C needn't be the only credible option in its domain forever.

There is another axis of 'badness', other than suitability for the problem domain. C's main problem is features that look fine but might subtly invoke wrong behaviour - the infamous traps and pitfalls.

If you can accept this view, then there /can/ be such a thing as a bad language, and it can be possible for two languages to suit the same domain and one to be better than the other. The domain that C serves well is not going away (though it might shrink, as more suitable tools are used for things like compiler-writing), but a better language for the domain could come along in the end.


The conclusion to this is that in bigger projects you should use multi-language solutions... which many people don't like.


Does anybody here have an idea why? I have heard reasons ("interfacing languages is a mess" and "you'd have to learn several languages" being the main ones), but none I found compelling (especially when the alternative is C++).

If it's irrational to reject multi-language solutions (I believe it is), why do people keep doing it?


I think there is a bit of history missing here.

When the DoD started what became the Ada mandate, they had some insane number of languages and projects scattered about. I don't remember the exact number but think something like 450-500 and this was from the 1960s and 70s when they still weren't terribly digital. At the macro level, that's impossible to maintain. I think a large part of the multi-language dislike came out of that. There is a giant difference between 3-6 languages and 400, it's also harder to get good people once you add a new technology to the stack that they need to know. There are generations of developers and IT/IS guys that have been trained and warned of the dangers of polyculture. That's why people keep doing it.


Tool support is usually poor, particularly for debugging. Interfacing languages is a mess, whether obviously (if you do it by hand) or hidden away (if you use SWIG). More code to deal with and step through. Higher chance of really ugly, hard-to-diagnose side effects from making a mistake on the C side. People working in both languages will regularly introduce bugs or perf problems because they get the languages mixed up as they're working. If you have less-technical people working in the higher-level languages, they'll work around bugs in the C side, so you never hear about them until you fix them and break everything.

I've worked on a number of mixed-language projects, and these same issues keep cropping up. Perhaps it depends on the team, but that is not something everybody gets the chance to choose. My current mixed-language program (tcl/C++) is working out OK, with just me working on it, but tool support is poor, and hand-writing the language interfaces is annoying.

(Notable exception: Visual Studio 2010 looks to offer pretty decent mixed native/managed debugging, and the CLR has good C++ interop support.)


Another exception, mixed Python and C/C++ debugging is apparently possible in gdb these days: http://misspent.wordpress.com/2012/03/24/debugging-cc-and-cp...

I haven't had the opportunity to try it yet, I only heard about it a couple of days after the last time it would have been really really useful (naturally). But I know it's going to come up sometime.


Well well well... I've long liked the look of gdb 7 and its python scripting, and now it is looking better and better! I like it when I can retire a longstanding problem. (Luckily there's never any shortage of further problems to replace it with, so I won't run out of things to talk about.)

Now seems doubly annoying that Apple make you use lldb or crappy old gdb 6 for iOS.


It increases complexity. Your build system needs to be configured to support all the languages, you need code style guidelines for each of the languages, you need rules about when to use each language, you need to ramp up each new team member on the languages, etc. All this is overcomable of course, but it stacks the odds against using multiple languages.


> interfacing languages is a mess

The little bit I have seen of multi-language code would say that that holds, especially if the languages involved are not C and something else (what does a Python dictionary look like in Haskell? Answer: crap, unless you do a bunch of translation work). Debugging tools don't tend to work across language boundaries, etc. Maybe I gave up too quickly, but multiple languages in one executable just doesn't seem worth it. Writing components in different languages and having them communicate over some sort of shared bus, web services, whatever, is simple and fairly effective.


In a couple of old projects (HA SW on top of Linux) that I've worked on, the solution was using C and Python: C for the system stuff and Python for the CLI part. That division worked quite nicely. My personal taste would have been to use Python a bit more, but the division was 'set in stone' at the start of the project and some managers weren't inclined to change it.


I think C being a practical joke was one of the oldest Unix culture "jokes".


> Access to the hardware

Yeah, C is close to the metal. But only on a PDP-11, the assembly language of which C was conceived to integrate with.


It's always hard to C a loved one Go.



