I work better when I have some firm foundations; they help me solve everyday problems because I can generalize them. The thing I find in programming is that so many things are black boxes. Logic is scientific, but commands are not: a command in a language is just "something that does something", and it is very hard to work out what its logic is without understanding the underlying hardware. I learned brainfuck for a little while and enjoyed it very much, but it still has commands, and these commands are packages of functionality that cannot really be understood without understanding all the lower levels of logic. What I would like to know is: are there any mathematical or logical principles that help with generalizing programming problems?
3 Answers
There is a mathematical foundation to programming: it is called the lambda calculus, and it is very interesting. Unfortunately, unless you are doing research in CS or designing a new funky language, it won't help you much in everyday life.
It will help you learn and really understand languages, though. It will also help you have fun with the Haskell people who like to work with things like homeomasofemtomophisms. So I would still recommend some grounding in the lambda calculus to anyone interested in programming.
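As a small taste (this is just an illustrative Python sketch, not part of any particular textbook), here are Church numerals, the standard lambda-calculus encoding of the natural numbers, built out of nothing but single-argument functions:

```python
# Church numerals: numbers encoded purely as single-argument functions.
# ZERO applies f to x zero times, SUCC adds one more application,
# and ADD composes the applications of its two arguments.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(church):
    """Convert a Church numeral back to a plain Python int for inspection."""
    return church(lambda k: k + 1)(0)

ONE = SUCC(ZERO)
TWO = SUCC(ONE)
print(to_int(ADD(TWO)(TWO)))  # prints 4
```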
If you haven't met them yet, I would also recommend learning about design patterns. I wouldn't call them "science", but they give you a good basis for talking about code with some precision without redefining everything all the time.
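As an illustration, here is a toy Python sketch of the Strategy pattern, one of the classic "Gang of Four" patterns (the class and function names are made up for the example):

```python
# Strategy pattern, toy version: the sorting policy is a named, swappable
# object instead of an if/else buried inside the caller.
class ByLength:
    def key(self, word):
        return len(word)

class Alphabetical:
    def key(self, word):
        return word.lower()

def sort_words(words, strategy):
    # The caller only depends on "something with a key() method";
    # the concrete policy can be chosen or changed at run time.
    return sorted(words, key=strategy.key)

words = ["pear", "fig", "banana"]
print(sort_words(words, ByLength()))      # ['fig', 'pear', 'banana']
print(sort_words(words, Alphabetical()))  # ['banana', 'fig', 'pear']
```

Having the name "Strategy" means you can say "pass the comparison in as a strategy" and be understood immediately, which is exactly the precision the patterns vocabulary buys you.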
- 8,020
The thing I find in programming is so many things are black boxes.
Here are some resources that will open those black boxes:
Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. According to an Amazon review:
...The real value of Code is in its explanation of technologies that have been obscured for years behind fancy user interfaces and programming environments, which, in the name of rapid application development, insulate the programmer from the machine...
This Coursera course examines key computational abstraction levels below modern high-level languages. According to the course's description:
...We will develop students’ sense of “what really happens” when software runs — and that this question can be answered at several levels of abstraction, including the hardware architecture level, the assembly level, the C programming level and the Java programming level...
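The course itself works at the hardware, assembly, C and Java levels, but the same "look one level down" exercise can be done from Python with the standard dis module, which shows the bytecode the CPython virtual machine actually runs:

```python
# Peek one abstraction level below the source: dis shows the bytecode
# that the CPython virtual machine executes for a function.
import dis

def add(a, b):
    return a + b

dis.dis(add)
# Typical output (exact opcodes vary by CPython version):
#   LOAD_FAST    a
#   LOAD_FAST    b
#   BINARY_ADD   (BINARY_OP 0 (+) on newer versions)
#   RETURN_VALUE
```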
And finally my favorite:
@LieRyan provided an excellent explanation from the atom all the way up to assembly language.
EDIT 1:
What's the best way to inject science into everyday programming?
There are so many arguments as to whether a Computer Scientist is really a scientist and whether a Software Engineer is really an engineer.
Anyway, the following resources will help you inject science into everyday programming:
Knuth's books are considered the bible of algorithms. The study of data structures and algorithms is the closest thing to the scientific method in Computer Science that I can think of.
This course covers the essential information that every serious programmer needs to know about algorithms and data structures, with emphasis on applications and scientific performance analysis of Java implementations.
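The course uses Java; as a rough sketch of the same "scientific" habit in Python, a doubling experiment forms a hypothesis about growth rate and then tests it by measurement (quadratic_work here is just a stand-in workload):

```python
# Doubling experiment: time a workload at sizes n, 2n, 4n, ... and look
# at the ratio of successive timings. A ratio near 2 suggests linear
# growth, near 4 suggests quadratic growth, and so on.
import time

def quadratic_work(n):
    total = 0
    for i in range(n):
        for j in range(n):
            total += i * j
    return total

previous = None
n = 250
while n <= 2000:
    start = time.perf_counter()
    quadratic_work(n)
    elapsed = time.perf_counter() - start
    ratio = elapsed / previous if previous else float("nan")
    print(f"n={n:5d}  time={elapsed:.4f}s  ratio={ratio:.2f}")
    previous = elapsed
    n *= 2
```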
I would also look at so-called Turing machines; see http://en.wikipedia.org/wiki/Turing_machines
They are completely formal and stand on their own, meaning no lower-level understanding is needed. The theory is quite involved, but it is the most low-level foundation of programming languages and has a mathematical/formal background.
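To get a feel for how little machinery is involved, here is a minimal, purely illustrative Turing machine simulator in Python; the flip_bits table is a toy machine that inverts a binary string and halts at the first blank cell:

```python
# Minimal Turing machine: an (unbounded) tape, a head position, a current
# state, and a table mapping (state, symbol) -> (new symbol, move, new state).
from collections import defaultdict

def run(tape, transitions, state="start", accept="halt", blank="_"):
    cells = defaultdict(lambda: blank, enumerate(tape))  # tape index -> symbol
    head = 0
    while state != accept:
        symbol = cells[head]
        new_symbol, move, state = transitions[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Toy example machine: flip every bit, halt on the first blank.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("10110", flip_bits))  # prints 01001
```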
- 327