I believe C is a good language to learn the principles behind programming. What do you stand to learn in lower level languages that are "magicked" away from high level ones, such as Ruby?
7 Answers
I know C is a good language to learn the principles behind programming.
I disagree. C lacks too many features to teach the principles behind programming. Its facilities for creating abstractions are poor, and abstraction is one of the key principles behind programming.
If you want to learn how hardware works, and hence some mechanical sympathy for the machine, you should learn machine code, aka the instruction set architecture, and also study the cache construction of modern CPUs. Note that I am not recommending assembly language, just understand the hardware instructions, so you understand what a compiler generates.
If you want to learn programming principles, use a modern language like Java, C#, or Swift, or one of the dozens of others like Rust. Also, study the different kinds of programming paradigms, including functional.
There are no principles, in the general abstract sense of computer science, that are present in C that are not also present in higher-level languages. All of computer science boils down to algorithms, and all algorithms can be implemented in any language that is Turing-complete as C is.
The difference that sets C apart from higher-level languages is similar to the difference that sets machine code apart from C: the relationship of the machine to the code.
As you write code in high-level languages, you do not (generally) concern yourself with how your code interacts with the machine. The virtual machine that the language defines for itself hides many of those aspects of code execution.
In C, your program's interaction with memory is held at the forefront. It is more than merely having to manage your use of the heap: it includes your code's interaction with the stack, and how your code's mere access of memory affects its behavior and performance. Not even the order of memory accesses can be allowed to escape your attention, as reading the wrong memory at the wrong time can cripple performance.
In higher-level languages, these things are simply not as obvious. Memory is allocated and deallocated without your knowledge, and sometimes without your prompting. Most often, this is simply out of your control. The when, where, how, and why of most memory allocations are simply hidden from you.
Likewise, in the other direction, writing machine code or assembly code brings yet more details to the foreground: Almost nothing is left outside of your purview, and your code must be written aware of every allocation, every resource, every piece of data that passes through the CPU's registers - knowledge that is so far removed from high-level languages as to be arcane.
C and the (Abstract) Machine
Most programming languages are described in terms of abstract machines. Then, they are implemented using sets of tools like compilers, linkers, assemblers, interpreters, static analyzers, intermediate languages, and hardware that will collectively produce a result that honors at least all the expected behavior of the abstract machine, as observed by a program.
C is not an exception to the above rule. It is described in terms of an abstract machine which has no notion of your actual hardware.
As such, when people say that C teaches you how your computer actually works, what they usually mean is that C teaches you how C works. But C being so pervasive in systems programming, it's understandable that a lot of people start confusing it with the computer itself. And I would personally go as far as to say that knowing how C works is often more important than knowing how the computer itself works.
But still, C and the computer are different things. Actual hardware is actually complicated -- in ways that make the C spec read like a children's book. If you're interested in how your hardware works, you can always look up a manual and start writing code in an assembler. Or you can always start learning about digital circuits so that you can design some hardware on your own. (At the very least, you will appreciate how high-level C is.)
How do you learn? And how do *you* learn?
Okay, actually learning about hardware involves things other than C. But can C teach anything else to programmers today?
I think it depends.
- There are some who would say that you can better learn a concept by working in an environment that offers it and encourages it, often in multiple ways.
- There are some who would say that you can better learn a concept by working in an environment that doesn't offer it and where you instead have to build it yourself.
Don't be too quick to choose one of these possibilities. I've been writing code for many years and I still have no idea which one is the correct answer, or whether the correct answer is just one of the two, or whether there is such a thing as a correct answer on this issue.
I'm slightly inclined to believe that you should probably eventually apply both options, loosely in the order I described them in. But I don't think this is really a technical matter, I think it's a mostly educational one. Every person seems to learn in significantly different ways.
Exclusively in C
If you answered the above question by at least involving the second option I proposed, then you already have a few answers under your belt: anything that can be learned in higher-level languages can be better learned by re-inventing it in C or at least expanded by adding C to the mix.
But, regardless of your answer, there are certainly a few things that you can learn almost exclusively from C (and perhaps a handful of other languages).
C is historically important. It is a milestone that you can look at and appreciate where we came from and maybe get a bit more context about where we're going. You can appreciate why certain limitations exist and you can appreciate that certain limitations have been lifted.
C can teach you to work in unsafe environments. In other words, it can train you to watch your back when the language (any language) can't or won't do it for you, for any reason. This can make you a better programmer even in safe environments because you'll produce fewer bugs on your own and because you will be able to turn off safety temporarily in order to squeeze some extra speed out of your otherwise safe program (e.g. use of pointers in C#), in those cases where safety comes with a runtime cost.
C can teach you that every object has storage requirements, a memory layout, the fact that memory can be accessed through a finite address space, and so on. While other languages don't need your attention on these matters, there are a few cases where some acquired intuition can help you make more informed decisions.
C can teach you about the details of linkage and object files and other intricacies through its build system. This can give you a useful hands-on understanding of how a natively compiled program often goes from source code to execution.
C can bend your mind to think in novel ways through the concept of undefined behavior. Undefined behavior is one of my favorite concepts in software development, because the study of its implications on non-classical compilers is a unique mental exercise that you can't quite get from other languages. However, you'll have to reject trial-and-error and start studying the language in a careful and deliberate way before you can fully appreciate this aspect.
But perhaps the most important realization that C can grant you, being a small language, is the idea that all programming boils down to data and operations. You might like to look at things as modular classes with hierarchies and interfaces with virtual dispatch, or elegant immutable values that are operated on using pure mathematical functions. And that's all fine -- but C will remind you that it's all just data + operations. It's a useful mindset because it allows you to bring down quite a few mental barriers.
The reason C is good for learning is not that it teaches any principles. It teaches you how things work.
C can be compared with one of those good old cars from the 70s or 80s, which were just built to drive. You can tear them apart, screw by screw, and understand how each part works, and how it works together with the other parts that you can take in your hands to look at. Once you understand all the parts, you have a very clear picture of how the whole operates.
Modern languages are more like a modern car where the engine is basically a black box, way too complex to be understood by the average car owner. Those cars can do a lot; heck, these days they are actively learning to drive themselves. And with that complexity and comfort, the user interface has been moved much further away from what is actually going on in the engine.
When you learn programming in C, you come into contact with a lot of the screws and nuts that a computer is made of. This allows you to develop an understanding of the machine itself. For instance, it allows you to understand why it is not a good idea to build a long string like this:
String result = "";
for (int i = 0; i < components.length; i++) {
    result = result + components[i];
}
(I hope this is correct Java; I haven't used it in a while...) From this code example, it is not obvious why the loop has quadratic complexity. But that is the case, and it is the reason why this code will come to a grinding halt when you have a few million small components to concatenate. Because each `+` copies everything accumulated so far into a fresh string, concatenating n components of similar size costs roughly 1 + 2 + ... + n copies. The experienced C programmer knows instantly where the problem is, and will likely avoid writing such code in the first place.
There are better languages than C to learn "the principles behind programming", especially theoretical principles, but C may be good for learning some practical, important things about the craft. greyfade's answer is surely correct, but IMHO there is more you can learn from C than how to manage memory by yourself. For example,
how to create programs with complete error handling in the absence of exceptions
how to create structure in a program without having any language support for object orientation
how to deal with data in the absence of data structures like dynamically sized lists, dictionaries, or a useful string abstraction
how to avoid common errors like array overflows even when the compiler or run time environment does not warn you automatically
how to create generic solutions without having language support for templates or generics
and did I mention you can learn how to manage memory by yourself, of course? ;-)
Moreover, by learning C, you will learn where the syntactical commonalities of C++, Java, C#, Objective C come from.
In 2005, Joel Spolsky wrote a recommendation for learning C before any other higher level language. His arguments are
"you'll never be able to create efficient code in higher level languages."
"You'll never be able to work on compilers and operating systems, which are some of the best programming jobs around."
"You'll never be trusted to create architectures for large scale projects"
"if you can't explain why while (*s++ = *t++); copies a string, or if that isn't the most natural thing in the world to you, well, you're programming based on superstition."
Of course, what he wrote is surely debatable, but IMHO lots of his arguments are still valid today.
The two basic abstractions of computing are Turing machines and lambda calculus, and C is a way to experiment with the Turing machine view of computation: mostly a succession of low-level actions from which a desirable result emerges. But you have to keep in mind that C comes with its own model of computation. So, learning C will teach you the low-level details of the C abstract machine, which is getting quite different from actual architectures. One of the first things I was taught in C was never to try to outsmart the compiler with "clever" tricks, and the trend is towards more and more optimization in compilers. Like in other high-level languages, when you write in C, compilers sort of understand what you expect to be done on the abstract C machine and make it happen on the actual hardware, sometimes in very unexpected ways (reordering statements, etc.).
So what I mean is that learning C is not necessarily going to give you a good picture of what actually happens in the hardware. "C is close to the machine" should be understood as "closer than most higher-level languages". Learning software architecture directly is going to be more rewarding if you want a more accurate picture of "how it works".
On the other hand, learning C can familiarize you with systems programming, low-level types, pointers, and in particular memory allocation. As for learning algorithms and data structures, I don't think there is an advantage to learning them in C rather than in other languages.
This is a very open-ended question and may not yield a conclusive answer. You can learn pretty much everything in every language, provided you understand the internals of the language's framework. C forces you to understand lower-level details because it was primarily designed for writing an operating system. Even after so many years, it is still one of the most used languages, primarily thanks to embedded systems and open source projects. So the question is: what do you want to learn? And in which domain?