
I've seen a bunch of sites that benchmark new hardware on gaming performance, zipping files, encoding a movie, and so on. Are there any that test the impact of new hardware (like SSDs, new CPUs, or RAM speeds) on compile and link times, on either Linux or Windows?

It'd be really good to find out what matters most for compile speed and be able to focus on that, instead of just extrapolating from other benchmarks.

Colen

3 Answers


I did that for a while - see here and here.

At the time, I was working on GTK+ and X11 hacks for a Linux cell phone distro, and every time I touched something at such a low level, it triggered rebuilds of all kinds of things. One of my colleagues never ran complete builds, because on the computer the company supplied, with the standard compile options, a full build took five hours.

I had all kinds of crazy hardware sitting around at home, so I ran benchmarks on some machines while I coded on others, and you can see the results at the links.

For what we were doing on Ubuntu, once I maxed out CPU utilization - which you can do really easily with the -j argument to make - the bottleneck seemed to be the disk.
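To illustrate the `-j` point above, here's a minimal sketch of maxing out CPU utilization with GNU make. It assumes a Linux box with coreutils (`nproc`) and an existing Makefile in the current directory; the build commands are shown commented out since they depend on your project.

```shell
# Sketch: parallel builds with make, assuming GNU make and coreutils.
# -j N lets make run up to N recipes at once; nproc reports the CPU count.
JOBS=$(nproc)
echo "building with ${JOBS} parallel jobs"

# time make -j"${JOBS}"          # one job per core
# time make -j"$((JOBS * 2))"    # oversubscribe a bit to hide I/O waits
```

Once `-j` saturates the cores, timing the same build on different disks (or a tmpfs) is a quick way to confirm whether I/O is the next bottleneck.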

But then the company had big layoffs, so I was out the door, and didn't finish scoping that all out. I had a lot of data and interpretation I didn't post on that blog, too.

Bob Murphy

Tom's Hardware used to, but it looks like they stopped doing it back in 2008: http://www.tomshardware.com/charts/desktop-cpu-charts-q3-2008/benchmarks,31.html. None of the newer CPU charts include the Linux Kernel compiling test.


First on my wishlist is a solid state drive. It won't have a huge impact on compile time, but opening applications becomes drastically faster (IDE, Photoshop, etc.). http://joelonsoftware.com/items/2009/03/27.html

The biggest factor for compile time is going to be the CPU. You're pretty safe using http://www.cpubenchmark.net/ as the benchmark for that.

Evan