287

I must be missing something.

The cost of employing a programmer in my area is $50 to $100 an hour. A top-end machine is only $3,000, so the cost of buying a truly great computer every three years comes to $0.50/hour ($3,000 / (150 weeks * 40 hours)).

Do you need a top-end machine? No, the $3,000 here represents the most that could possibly be spent, not the amount that I would expect. That's roughly the cost of a top-end iMac or MacBook (17-inch).

So suppose you can save $2,000 every three years by buying cheaper computers, and your average developer costs $60 an hour. (These are the most charitable numbers that I can offer the bean-counters. If you only save $1,000, or $750, it only strengthens my case.) If those cheaper computers cost you only 10 minutes of productivity a day (not at all a stretch; I'm sure that my machine costs me more than that), then over three years the 125 lost hours add up to a loss of $7,500. A loss of 1 minute a day ($750) would give a net gain of $1,250, which would hardly offset the cost of poor morale.
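For anyone who wants to check or adapt this arithmetic, here is a back-of-the-envelope sketch in Python. It simply restates the figures above; the 250 working days per year is my assumption (it reproduces the 125-hour number), so plug in your own rate, savings, and minutes lost:

    # Back-of-the-envelope cost of slow developer hardware.
    # Figures are the question's assumptions: $60/hour employment cost,
    # $2,000 saved over three years, roughly 250 working days per year.
    HOURLY_COST = 60
    HARDWARE_SAVINGS = 2000
    WORK_DAYS_PER_YEAR = 250
    YEARS = 3

    def lost_dollars(minutes_lost_per_day):
        """Productivity lost over the machine's lifetime, in dollars."""
        hours_lost = minutes_lost_per_day / 60 * WORK_DAYS_PER_YEAR * YEARS
        return hours_lost * HOURLY_COST

    for minutes in (1, 10):
        loss = lost_dollars(minutes)
        net = HARDWARE_SAVINGS - loss
        print(f"{minutes:>2} min/day -> ${loss:,.0f} lost vs ${HARDWARE_SAVINGS:,} saved "
              f"(net {'gain' if net > 0 else 'loss'} of ${abs(net):,.0f})")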

Is this a case of "penny-wise and pound-foolish" or have I oversimplified the question? Why isn't there universal agreement (even in the 'enterprise') that software developers should have great hardware?

Edit: I should clarify that I'm not talking about a desire for screaming-fast performance that would make my friends envious, and/or an SSD. I'm talking about machines with too little RAM to handle their regular workload, which leads to freezing, rebooting, and (no exaggeration) approximately 20 minutes to boot and open the typical applications on a normal Monday. (I don't shut down except for weekends.)

I'm actually slated to get a new machine soon, and it will improve things somewhat. (I'll be going from 2GB to 3GB RAM, here in 2011.) But since the new machine is mediocre by current standards, it is reasonable to expect that it will also be unacceptable before its retirement date.

Wait! Before you answer or comment:

  1. $3000 doesn't matter. If the machine you want costs less than that, that's all the more reason that it should have been purchased.
  2. I'm not asking for more frequent upgrades. Just better hardware on the same schedule. So there is no hidden cost of installation, etc.
  3. Please don't discuss the difference between bleeding edge hardware and very good hardware. I'm lobbying for very good hardware, as in a machine that is, at worst, one of the best machines made three years ago.
  4. $50 - $100 / hour is an estimate of employment cost - not salary. If you work as a contractor, it would be the billing rate the contracting agency uses, which includes their expenses and profit, the employer's Social Security contribution, the employer's health care contribution, etc. Please don't comment on this number unless you know it to be unrealistic.
  5. Make sure you are providing new content. Read all answers before providing another one.
Eric Wilson
  • 12,111

39 Answers

224

Many companies are certifiably insane around this.

Seriously. If you asked 10,000 tech managers, "Let's say you paid Danica Patrick $100,000,000. Do you think she could win the Indianapolis 500 by riding a bicycle?", I'm sure not one of them would say, "Yes."

And yet a good percentage of these same managers seem to think that highly-paid software developers ought to be just as productive with crappy tools and working conditions as they are with good ones - because, of course, those lazy, feckless programmers are getting paid lots of money and ought to be able to pedal that bicycle faster.

Now, what exactly good tools and working conditions consist of depends on the job to be done. People who code the Linux kernel need different kinds of hardware than web site designers. But if the company can afford it, it's crazy not to get people what they need to be as productive as possible.

One company I worked for had a 9 GB source code base, primarily in C, and the thing we most needed was fast builds. Unfortunately, we were mostly working with hardware that had been mediocre five years before, so people were understandably reluctant to build much other than what they were working on at the moment, and that took its toll in low productivity, quality problems, and broken builds.

The company had money to upgrade the hardware, but was strangely stingy about it. They went out of business last summer after blowing through over $100 million, because their two biggest clients dropped them after repeatedly missed deadlines. We were asked one time to suggest ways to improve productivity; I presented the same kind of cost-benefit analysis the OP did. It was rejected because management said, "This must be wrong - we can't possibly be that stupid", but the numbers didn't lie.

Another company I worked for had fine computers for the programmers, but insisted everybody work at little tiny desks in a big crowded bullpen with no partitions. That was a problem because a lot of us were working with delicate prototype hardware. There was little room to put it on our desks, and people would walk by, brush it, and knock it on the floor. They also blew through $47 million in VC money and had nothing to show for it.

I'm not saying bad tools and working conditions alone killed those companies. But I am saying paying somebody a lot of money and then expecting them to be productive with bad tools and working conditions is a "canary in the coal mine" for a basically irrational approach to business that's likely to end in tears.


In my experience, the single biggest productivity killer for programmers is getting distracted. For people like me who work mainly with compiled languages, a huge temptation for that is slow builds.

When I hit the "build and run" button, if I know I'll be testing in five seconds, I can zone out. If I know it will be five minutes, I can set myself a timer and do something else, and when the timer goes off I can start testing.

But somewhere in the middle is the evil ditch of boredom-leading-to-time-wasting-activities, like reading blogs and P.SE. At the rates I charge as a consultant, it's worth it for me to throw money at hardware with prodigious specs to keep me out of that ditch. And I daresay it would be worth it for a lot of companies, too. It's just human nature, and I find it much more useful to accept and adapt to normal weaknesses common to all primates than to expect superhuman self-control.

Bob Murphy
  • 16,098
170

I would suggest that, in reality, one cost is visible and quantifiable, while the other cost is neither.

If failing to upgrade the hardware bleeds even as much as $1000 per developer per week from the budget, no one outside (read: above) the tech department ever sees that. Work still gets done, just at a slower rate. Even in the tech department, calculating that figure is based on numerous unprovable assumptions.

But if a development manager asks for $3000 per developer, particularly in a company with 50+ developers, then this takes a lot of justification. How does he do that?

pdr
  • 53,768
94

I will put my 2 cents in here from the employer's side ... who is also a developer.

I agree that low end machines are useless but top end machines are overkill.

There are a number of reasons why you don't get the top end machines:

  1. Cash flow is a real issue, not just a theory. You might be getting paid $60K-$80K per year, but this month we have a fixed amount in the bank which has to be split amongst every competing expense that month.
  2. There is a sliding scale of price and benefit. Low-end machines are on the whole pretty useless ... if you're getting a Celeron or low-power chip then whinge away ... mid-range machines have good overall performance; once you get into the top end you are starting to tune for a given purpose (CAD, gaming, video encoding, etc.) ... and the tuning costs extra.
  3. General parts are generally cheaper; replacements, warranties and insurance all play a part in the overall running costs, as does the downtime while you source a replacement.
  4. Top-end machines depreciate faster than ones a third of the price.
  5. If you're doing high-end graphics programming or CAD work then the extra grunt is valid; if you're just writing standard business software, running Visual Studio or Eclipse and surfing Stack Overflow for answers, then the extra power is cool bragging rights, but realistically a mid-range machine will not max out the CPU or memory in a standard box today.
  6. Mid-range machines built today hammer, and in 2 years' time they will be twice as fast (well, kind of). Seriously, they are lightning quick.
  7. At the end of the day most of what you do is type raw text into text files and send it to the compiler ... that bit really hasn't changed since vi in the 1970s, and the low-end machines today are a million times faster than the ones back then ... your pace of coding really isn't that different.

So to summarize: you should have good gear and good tooling, it makes a big difference, but top-end machines are not really justifiable for the "general developer".

... ah, and now I've read your edit, and that is what you are talking about. I'll leave the above since I've written it now ... yeah, your machine is underspecced for the tooling.

To clarify, a mid-range machine should have:

  • 2 cores minimum, 4 cores is good; any more at this stage is overkill.
  • 4 GB is the minimum, 8 GB is good, and any more is nice to have.
  • An SSD should be standard, but really a 10K RPM WD or Seagate 80-100 GB drive will do fine.
  • 2 x 19" monitors is the minimum, with a reasonable video card.
Manbeardo
  • 103
Robin Vessey
  • 1,587
56

The difference of productivity between the "top-end" machines and "almost top-end" machines is negligible. The difference in price is significant.

Not to mention the IT support for different machines instead of having all the developers using the same HW and SW images (which you can't do if you're buying a top-end machine for every new hire; the top end will be different every time). Also, people who got last year's top-end will want to upgrade because that newbie in the next cube has a "better" machine than they do, and they're oh so much more important, aren't they?

Unless you really need the top-end machine for your work, I see no reason to throw the money away.

oh whatever
  • 4,734
27

Because most employers do not understand how developers think, act or work, or how top tools can save the company money while increasing productivity. This leads to the loss of a point on the Joel Test, failure to provide "the best tools money can buy". This also leads to a loss in productivity and job satisfaction. That's just the way it is. Maybe one day you can start your own company and score 13/13. Until then, ask questions up front with your employer so you know what to expect before ever taking the job.

As far as your current situation, if you feel they listen and trust you then bring up the discussion. See if they will give you an upgrade. I know I'd work a little bit longer if I had a top of the line rig with dual 50" monitors to work with. Stick me in the matrix.

Same reason people want a Mercedes CLS when a Toyota Camry gets you there just the same. Sure, you may only squeeze a few more seconds of compile time out with a new machine, but appearances do matter.

P.Brian.Mackey
  • 11,121
12

Your math doesn't seem to include the time required to manage the constant flow of hardware into and out of the company -- it would take an extra IT guy or two depending on the size of your company, so tack another $50-$100k/year on top of your numbers. Plus, you lose productivity on the day they swap your computer out. If they skimp on dedicated IT staff you'll have to do the backups and restores yourself, possibly losing a day or two in the process. In other words, I think it's a bit more complicated than you think it is.

Bryan Oakley
  • 25,479
9

One problem with your argument is cashflow. If they don't have the money, the point is moot. The other is return on investment.

This may not apply to the companies where you've worked. Some companies are highly leveraged and/or cash poor. They would rather spend the savings you describe on something that will sell more widgets or software. You have to show that your gain in production outweighs an equal investment in other areas.

If a software company is in maintenance mode and needs more sales, there may be a better return on spending the money on sales and marketing.

I think you need to show that, in your case, the money is better spent on the programmer than on another area of the company.

Be careful with this argument if you're on salary. They'll just want you to work harder to make up the difference ;)

Neil N
  • 612
JeffO
  • 36,956
8

I made this argument at my work for switching from laptops to desktops. I said everyone should be on a desktop and if they need a computer at home - get them one there too.

The speed advantages of a good computer are not negligible, especially if you remove crashes from really old hardware.

Concerning "top of the line" and "near top of the line" - I would argue near top of the line is always where you should be. At "near top of the line" you can upgrade every 2 years instead of 3 and end up with better hardware on average.

I recommended cyberpowerpc.com and my company let me (the marketing guy) purchase a PC from them, but they bought all the programmers' PCs from Dell because the support was worth the extra cost. Think about that... it's 1.5-2x the price to buy a PC from Dell, but you can all appreciate that if the PC goes down and you can't fix it fast, you lose money.

A slow PC is like a broken PC you aren't repairing.

6

There's also a question of budgets - usually developers are paid out of a different budget than hardware for said developers, and there might simply not be enough money available in the hardware budget.

Timo Geusch
  • 2,773
6

First, to answer the question asked:

They can't do the math, or if they can, they somehow believe that it doesn't apply to them. Budgets and accounting for hardware and personnel are separate. People in decision-making positions have never heard of the issue and are totally unaware that a problem exists at all.

Now, to the real question: "How do I handle this situation?"

It's essentially a communication problem. You explain the problem and the interlocutor hears "bla bla bla we want shiny new toys". They just don't get it.

If I were in your position, I would make a quick video titled "Can we afford old computers?": Stills of a typical workstation. On the right side, a blank area titled "cost".

Still of the power button. Below: "Starting the computer. 20 minutes". In the blank area, "Starting the computer = $40". "Opening IDE = $5", "Computer freeze = $80", "building the product = $600"

Run through at a quick pace, keep adding the numbers, then compare with the cost of a new computer, and don't forget to end with: "This video was produced at home on a $500 store-bought laptop that outperforms all the 'professional' development machines currently available."
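A few lines of Python can generate the running total for such a video. The events, durations, and frequencies below are invented placeholders - replace them with whatever your team actually measures - and the hourly rate is the employment-cost figure from the question:

    # Rough tally of what slow hardware costs one developer per year.
    # Every event, duration, and frequency here is a made-up placeholder.
    HOURLY_COST = 75  # midpoint of the $50-$100/hour employment cost in the question

    # (event, minutes per occurrence, occurrences per year)
    events = [
        ("Booting the computer",     20,   50),   # e.g. every Monday morning
        ("Opening the IDE",           3,  250),
        ("Recovering from a freeze",  5,  100),
        ("Waiting on a slow build",   2, 1500),
    ]

    total = 0.0
    for name, minutes, times in events:
        cost = minutes / 60 * HOURLY_COST * times
        total += cost
        print(f"{name:<26} ${cost:>8,.0f} per year")
    print(f"{'Total':<26} ${total:>8,.0f} per year")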

If you are concerned that raising the issue will cause problems for you, you could also just bring in your own laptop to work.

If there is no way to get that issue across, then perhaps you should consider finding another job.

Sylver
  • 121
4

Discounts play a big part in the buying process as well.

Spitball (not real numbers): 100 machines @ $1,000 w/ 15% discount = $85,000

90 machines @ $1,000 w/ 10% discount = $81,000, plus 10 machines @ $2,000 w/ 5% discount = $19,000 => $100,000
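The same spitball as a sketch you can rerun with real quotes (the prices and discount rates are the invented numbers above):

    # Bulk-order cost after a flat percentage discount; all numbers are invented.
    def order_cost(quantity, unit_price, discount):
        return quantity * unit_price * (1 - discount)

    uniform = order_cost(100, 1000, 0.15)                            # one uniform order
    mixed = order_cost(90, 1000, 0.10) + order_cost(10, 2000, 0.05)  # 10 "special" machines

    print(f"100 identical machines:   ${uniform:,.0f}")  # $85,000
    print(f"90 standard + 10 top-end: ${mixed:,.0f}")    # $100,000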

As has been already mentioned, the extra cost in supporting the "special" machines needs to be added in the mix.

bart
  • 91
4

Personally, I have always had at least an OK development computer when I worked for a 'small' company, but when it comes to big companies, programmers are a dime a dozen compared to a project manager with a budget.

Especially if he/she is one of those with great ideas, read: budget approved.

Whatever the 'good' idea, that person will need really good programmers to actually implement the "New 'better' product", so they will pay the programmer the price needed.

Getting the new development computer, as far as I have been concerned, does not go through the same 'department' as the other budget, though, so do expect to work under bad conditions even if you are paid well :-) My last job: Dell E5xxx + one 1280x1024 LCD ...

Valmond
  • 521
3

I was asked to spec out the machine I wanted to use here, within a fairly tight budget. I managed to come up with a halfway decent system that works despite not being perk heavy.

I was originally thinking along the same lines as the OP here: the time I sit waiting for compiles or loads is money out the window. As I've been moving along, I've also recognized that the time I spend going to get a coffee, or walking to the printer, is also money out the window.

Rather than worry about the small amounts of time that I do have to wait, because we went with a less expensive development system, I've looked at my own habits and improved the larger amounts of time I spend doing nothing particularly useful (ahem... stackexchange is useful, and productive to boot, and I'm sticking to it!! :-) ) Of course we need breaks, but this is time other than "breaks".

So in a way, in a general sense, this question could be the "premature optimization" of work efficiency. Many great points have been made about migration costs, losing out on volume purchasing, etc.

In your particular situation, where you are losing time on the order of a break just to reboot and open programs, yes, it makes a lot of sense to upgrade to decent equipment, as your productivity is seriously impaired. A halfway decent i3 system with 4 GB RAM is on the order of $500; I'm sure it won't take long to recoup that cost.
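For what it's worth, a quick payback estimate under the question's own numbers ($60/hour employment cost) and an assumed 20 minutes saved per day from the $500 upgrade:

    # Rough payback period for a hardware upgrade; all inputs are assumptions.
    UPGRADE_COST = 500            # the i3 system mentioned above
    HOURLY_COST = 60              # employment cost from the question
    MINUTES_SAVED_PER_DAY = 20    # assumed; adjust to your own situation

    saved_per_day = MINUTES_SAVED_PER_DAY / 60 * HOURLY_COST
    print(f"Break-even after about {UPGRADE_COST / saved_per_day:.0f} working days")  # ~25 days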

jm01
  • 101
Stephen
  • 2,141
3

Buying new hardware involves money; money involves decision makers, and usually they're not developers if your company is big enough. Of course we have exceptions...

As @Rob explained, there are a lot of reasons why you won't get the best hardware. Your company may have a policy defining what kind of hardware is bought, and as always with bureaucracy it's hard to have a bleeding-edge policy. Many managers won't bother adapting it to your personal needs, etc.

Poor communication, risk aversion and other flaws:

Let's consider you have reaaally crappy hardware, it's no longer possible to work in these conditions and you want to do something about this.

Now you have to go convince your manager. Well, usually you'll have to convince your project manager, who tells your manager, who reports to his boss, and you'll need to make sure that that guy really understands your issues. This involves communication skills and the technical understanding of the management.

Second step: if you're lucky enough, the management will think about it. What do they get?

  • You'll work faster, with some uncertainty (they don't directly get money, as you'll try to explain).
  • It'll cost money, now.

That means they'll have to trade money, and their current planning of your work, for a possible opportunity to let you do something else in the future; that's an investment, but also a risk.
Sadly, many managers are risk-averse. Not to mention that the poorer their understanding of your issue, the riskier it appears. Some may also have a hard time recognizing that someone did not buy suitable hardware in the first place.

Moreover, management usually has a shorter definition of what "long term" means. If they're asked to do some sort of monthly budget optimization, they may even have direct financial incentives not to buy you new hardware! And they won't care about the two weeks you may save six months later.

Of course you don't always have to wait so long when you can do wonderful stuff in one day!

That works better if you have smart and open-minded managers who listen, understand your issues, are ready to take reasonable risks and trust you enough to let you explore creative ways to use the freed time.

That's not always the case: I waited 3 months to get a graphics card to connect my second screen while being forbidden to buy it myself (30€), lost 3 days for not having an extra 500 GB HDD, and regularly had to wait several hours when preparing data for the client because of the slow 100 Mbps network. After asking several times for 2 GB of RAM, I was told to buy it myself and stop bothering the management with those technical issues. And we were doing scientific computing for a big industrial client who was ready to pay the price...

Maxime R.
  • 101
3

Math aside, not all of your users are likely to have top-end machines. Developing on a machine that is spec'ed more closely to something average in price will acquaint the developer more closely with the experience (and pains!) of their users.

Your QA department may have a min-spec machine, but how often is it used? Developing on a machine that is a realistic target environment exposes issues early on (unresponsiveness, poor performance, race conditions exposed by that slow performance, etc.), which drives teams to fix them sooner.

3

One big factor is the kind of bloatware that the IT department in a typical big company tends to put on the laptop. If you have a Windows 7 machine at home with just some antivirus, a standard SSD-3GB-quad-core system will boot up in less than 10 seconds. Compare that to the bloatware my company puts on, and it takes forever to boot. I have seen some folks zap the OS completely and install their own to speed things up. I think that solves the problem to an extent, although it is a huge InfoSec violation. But seriously - 10 minutes?!

3

In large corporate organisations the choice of hardware is pre-defined and locked down, because such organisations have fixed, centrally managed desktop and laptop specifications and configurations. The specifications for these will have been dictated overwhelmingly by a combination of "procurement" and "support" considerations. The company I am currently working at, for example, has over 100,000 employees and works on the basis that "one size fits all", and that size will have been primarily driven by commercials.

Once such policies are in place, they are locked down, because support services usually invest a considerable amount of time in testing and deploying the software to that "standard" machine specification. Arguments around "developer" productivity, in such environments, simply fall on deaf ears; production services are not going to make an exception for a small group on the basis that they may be more productive. If they did so, they would quickly be swamped with requests for deviations, and in any event they (production support) are incentivised to keep the support cost as low as possible. More than one desktop/laptop configuration increases the support cost.

In an organisation where the primary "product" is the result of software engineering, such arguments are invalid, but the reality is that most organisations are NOT, and the key driver is keeping support costs low.

2

Simply because the best hardware does not make the 'best' developers! That being said, the company is to blame if it is hindering the work of the programmer.

However, if the hardware is sufficient for the developer to work, then he has nothing to complain about.

Also, there's no point in having the 'best' hardware if all it runs is an IDE - that's a waste of resources.

Sterex
  • 101
2

"We have met the enemy and he is us." - Pogo

Either way you slice this question - the collective group "programmers" bears direct responsibility for any failure to buy the best tools in the workplace.

  1. Business finance is incredibly complicated, with numerous conflicting motivations and levers. Without concrete knowledge of what your finance department is currently tracking (tax avoidance, managing quarterly expenses, driving up future capital expenses, maximizing EBITDA or whatever else is on their radar), any discussion of true costs is irrelevant. How would you react to a marketing person bugging you about compiler optimizations for code you know is about to be transitioned to an interpreted language? If programmers cannot demonstrate in specific terms how the tools they have hold back the bottom line, the business is correct to spend as little as possible. We also have to learn to listen to business finance so we can understand the realities facing resource allocation.

  2. We as a group vote with our presence in the workplace far louder than by asking for better tools, submitting the most awesome white paper to our managers, or even posting on the internet. There are organizations that have created a culture of ensuring their employees either have the tools they justifiably need or understand the case as to why not at the moment. Until competitive pressure requires this from the majority of employers, we can only vote by seeking out employers we believe in.

Each of us has to either make this something that matters to the core, or let it go.

bmike
  • 182
2

I used to be a developer at a large company and then a startup. Here are my two cents:

  1. 8 GB of DDR3 DIMMs (2 x 4 GB) costs $50-$55 today (circa July 2011)
  2. A 21" LCD monitor costs $200 (circa July 2011)

If your company allows you to bring your own equipment, just use your own $ and upgrade the RAM and LCD monitor. Why you ask?

  • isn't your own productivity something you value?
  • aren't your eyes worth $200?

You can always take the monitor with you when you quit the job (remember to clearly label it as your personal property). I've done the above recipe (upgrading RAM and using my own LCD monitor) in both my previous jobs - and my current job.

2

I don't see how you can group all employers together in one basket. I've worked for a few employers as an employee and as a consultant and always got hardware that was more than sufficient for my needs - for my current job I was handed a bright shiny new HP quad-core with 4 GB RAM and Win64 on the first day - not top of the line, but very sufficient - (I use Delphi XE and XMLSpy as my main development tools) - in fact so nice I went and bought the same machine for myself at home. (Maybe I'm not all that productive! LOL.)

If you don't get good hardware, try asking for it - and if you feel you can't ask for it, you're probably not working at the right place because they don't view developers as a resource, but as a liability.

So I guess the answer to your question is: those companies that don't and/or refuse to provide sufficient hardware for a developer are companies that consider their developers a liability - jobs they'd rather outsource and not deal with at all.

Vector
  • 3,241
2

CFO side.

The company has a lot of expenses. Every department needs more $ in order to do better and in every department the expense is a must.

When you come to choose the best way to use the available $, you take into account:

  • How much do they need? Smaller sums are easier to approve.
  • Will it increase sales? Better PCs usually don't contribute directly to an increase in sales.
  • Does the department like to spend $, or do they understand cash flow? Most R&D departments I have seen have an arrogant "we deserve the best" approach. This is understandable, as they earn a lot of $, and when you do, you think you deserve the better things in life. The $ needs of R&D teams usually give the feeling of a spoiled child requesting more toys while his parents are struggling. "A delicate genius".

The 10-minutes-a-day waste is not an argument that would work with most finance departments. Most R&D teams waste a lot more on all the non-programming activities they enjoy during the day. Let's chart all the waste in your department and see what can be done to improve productivity.

1

Simply put, purchasing decisions are often made by bean counters (accountants and middle managers) rather than by project managers.

Lots of people have given potential reasons, and all of them are a factor in one situation or another, so there isn't any single overriding one. Buying equipment at scale may mean they lose some money on programmer productivity but gain money in other areas.

Still, it often just comes down to a budget. You have to fit in the budget, and that's all there is to it.

1

I used to work for a networking company where they upgraded RAM from 512 MB to 1 GB last year. We were working with f**king CRT monitors in 2010. The funniest part was that the managers' hardware was upgraded to 2 GB of RAM. Why on earth anyone would want 2 GB to create damn PPTs, and how someone would develop applications with 1 GB of RAM, I will never know.

1

It comes down to who handles the money. In larger organizations IT is given a budget of, say, $1M for the year. That includes support salaries, servers, etc. They have to spread it around between all their resources. They cut deals with vendors like Dell or IBM to get x number of the same kind of computer, which they give to everyone from customer support to the programmers. They also get deals on support, etc., when they only have to maintain a limited set of models. They are not programmers either; I have had numerous arguments with non-programmers about computers. When I went over my IT manager's head for some new HDs one time, the CEO said buy them, and boom, everybody finally had enough disk space to run virtual machines.

I actually blew up and cussed out my boss because IT was going to take away my 19" second monitor because I had a laptop. They stiffed me on that too, giving me a 13" model when others were getting 15". That goes back to politics in IT, which is another problem. It's kind of an us-vs.-them mentality sometimes.

Bill Leeper
  • 4,115
1

From the perspective described by the asker, the question makes complete sense. However, there are more costs involved with keeping hardware current.

Here are some of the costs that also need to be considered:

  • requisition cost (research and details that goes into purchasing)
  • installation & configuration cost
  • support & maintenance cost
  • software licensing cost
  • disposal / upgrade cost

In some cases, these can be 2-5x greater than the cost of the hardware itself. Even more if there is sophisticated software licensing involved.

In general the scale of these costs depends on the size of the company or the complexity of the organizational structure. Smaller teams with direct access to purchasing power can keep these costs low, whereas in a larger organization these costs can get very high.

Joshua
  • 103
1

Because a lot of companies outside of typical tech start-ups are not interested in hiring rock stars. They're investing in someone who can just do the work. So if they don't care how you do your work as long as you do it, why should they care what equipment you use? I've worked at places that still use 15-inch CRTs and everyone does just fine. Sometimes when I read questions like this, I wonder if people realize that not everybody in the world works for a cool start-up.

Sergei
  • 161
1

I've worked for companies that skimped on hardware in the past. It sucks, and if they need convincing the battle is likely to be a never-ending one.

Turns out that companies committed to using the best available tools are rare, but they do exist; I work for one. I've got a quad-core 17" 2011 MBP, 8GB RAM, Vertex 3 SSD, 2 x 24" external monitors, plus a quad-core desktop and a 4GB Xen slice; as well as quiet offices.

Could I get by with lesser hardware? Sure. But I think we'd all rather be bragging than bitching.

1

In my opinion, there are only two defensible objections a company could raise to keeping developers set up with solid workstations. The first is that it is undergoing a cash crisis. That had better be short-lived, or the company will not be a going concern for long. If you work for a company like that, you should keep your resume up to date.

The other is that their organization is simply not bottle-necked on software development capacity. That is, an increase in the quality or speed of software development output would not improve the bottom line. If the company's main business is selling software, that will be practically impossible. If software isn't their main business, and they aren't bottle-necked on it, they should be trying to reduce their software workforce by transferring or letting go of their weakest team members. Supplying poor equipment will reduce the size of their team from the opposite end, I'm afraid.

0

I must have missed the author's perspective.

First, Google, as one example, was founded using cheap, "disposable" hard drives attached to older servers run as a farm. OK, that might be hyperbole, but see: http://en.wikipedia.org/wiki/Google_platform#Original_hardware

Second, it doesn't take much CPU or graphics resource to run gvim. So maybe your choice of development environment is the problem.

Third, there are dozens if not hundreds of ways to enhance productivity that reduce CPU load and have little to do with whether you have 2 GB of RAM or 3 GB of RAM. Watch an average programmer over their shoulder to see this: for example, using a lightweight PDF reader vs. the Adobe suite for the documentation; using a minimal installation of a VM for testing apps rather than a full install; removing all those startup daemons bundled with Windows Dell machines (using regedit); using a lightweight browser for webmail instead of keeping Outlook running; not opening 50 gazillion tabs in Firefox chasing solutions to MSFT implementation issues on the web; etc. So this point boils down to the following: prove you need more memory and MHz to solve this software design problem faster.

0

New machines and newer technologies mean newer problems. Not everyone at every company is a tech whiz, and not every company has the IT resources to train people and handle problems 24/7.

Yes, perhaps if you're a freelance programmer working on your own personal desktop it would be worth blowing $1,000 on a rig to squeeze out 10 minutes of extra productivity every day. However, when you're deploying hundreds of these machines to people who may lose productivity because of new equipment, the prospect seems a little more grim.

tskuzzy
  • 752
0

Once I tried to argue for the company (largish) to buy us developers decent consumer-grade systems. Essentially the performance specs on them were comparable to the enterprisey version, but at half the price. My argument was that at these prices they were essentially throwaways, so if one broke, just buy a new one (on the assumption that better than 75% would last 24 months). I suggested that in exchange for getting one of these laptops the developer would have to sign an agreement (or something) that he/she would be responsible for the SW load/configuration and the help desk would not 'help' fix it.

It didn't fly, but I thought the basic premise of the argument was reasonable, considering we did Windows dev and all of us were local admins.

0

Why not? Because it's not accountable. We can't precisely match each hour of work with a profit margin.

A simple solution for this would be refunding whoever pays for his own machine upgrades. If your math holds up, it should be easy to prove your own gain from the productivity improvement by comparing the past two periods (week, month, semester, year, or whatever) on the exact same job/project.

If developers were able to quantify how much they are generating over a period, the issue would disappear. Most developers can't; nor can their managers, and even less so the finance folks, because the job is very subjective.

But if you can somehow show those numbers (I know I can't), then you're all set for your cost-effective-non-self-booting-dream-machine already!

cregox
  • 717
0

New programs run great - on the developer's computer. Buy a developer a 4 GHz 8 core box and the application he creates will run fine - on any 4 GHz 8 core computer. But on a typical customer's computer with 2 GHz and 1 core it runs like a dead snail.

Developers naturally keep adding features and code and levels of indirection until things slow down, on the development machines. If you're only developing for brand new hardware, then buy the latest. But it's a danger if you sell software to people with existing hardware.

A developer's computer should be about the same power level as the target customer's computer, with perhaps a bit extra for the debugger. But no faster.

0

2 GB on a developer machine is obviously shameful; however, solving this problem should not cost $3,000 - more like $100 (conservatively). Why make the case to upgrade everything all at once? Smart IT departments are continuously upgrading machines over their lifetime. Eventually you need an entirely new machine, but your machine is not running Windows 95-era hardware; it could be upgraded for $300-$500 into a typical mid-range machine, and these upgrades could happen over several months so there is not a cash flow problem. You probably do not need a new graphics card, sound card, USB ports, DVD writer, etc., so why pay for them now? It's like buying a new car because your AC is broken.

0

I think the "right" tools are required for the right jobs. If you don't have the "right" tools (hardware, software, or otherwise) I believe it is due a misunderstanding or miscommunication of the expectations between an employee and their bosses. This is both the developers and the company's responsibility. The higher the expectations the closer the "requirement" should be looked at.

This being said I know several developers who "need" 8 GB of RAM for their machine when I've made due with less in more trying scenarios. But again I think it's understanding requirements.

Steve
  • 1
0

At my current company, developers are pretty high on the totem pole for hardware. I imagine that hardware is put on a normal company budget just like anything else, and the need outweighs the want.

In my opinion, a developer should be responsible for their own hardware, but that depends entirely on the situation. If you are asked to write a simple app for a simple website, then you might not need a sophisticated piece of equipment to sit in a text editor. On the other hand, if you are into contract programming and want to do some side gigs, you may want to consider buying your own hardware and base software, and having the company purchase individual API licenses as needed by that specific company.

Either way, it is all a matter of checks and balances, and if you are concerned with productivity, then your dollars are probably best spent monitoring how much code a developer is putting out for their time. If it takes them 10 hours to do one project and 5 hours to do a similar project, it may be an employee-related issue and not so much a hardware issue.

0

Companies make decisions quite differently from developers. Most have mechanisms in place for providing appropriate hardware for the task: approved purchase channels, groups responsible for installation, testing, and compliance with security and other measures. So questions about changing hardware specifications can be complicated.

On the other hand, let's say you came to the CEO with a suggestion to spend the equivalent of 1% of salaries on upgrading equipment. He will ask the CFO to come up with the hit it would have on margins and income; let's say it's 5%. Now, missing the estimates may have an amplified effect on the company's stock price, say 10%, and upper management loses their million-dollar bonuses. Unless there are good reasons to expect the upgrade to improve the company's bottom line, the suggestion will be DOA. Companies seek to increase expenditures only if it improves income. That means that in most cases both low-end and high-end equipment are sub-optimal.

One solution that would satisfy both developers and company management would be allowing developers to pay rent on the equipment; a typical system would run $20-200/month if rented for 2 years. A company can have a range of approved hardware and offer developers either a standard configuration or an upgraded configuration, with the additional rent deducted from their paycheck.

-3

The question presupposes that good hardware makes a significant difference. I recently switched to a MacBook Air: reduced CPU performance, less fan noise, less headache. I think that a far, far greater factor is the human factor: what coding language? Are you using a dynamic programming language? What is the culture? Are you running a build every two seconds? Long (and unnecessary) test-suite runs? Far better to get the environment sorted out; rarely is a high-spec development machine needed. The good software developers, the masters of the craft - what are they? They are writers. They write code for other coders to read. Actual function, speed, and machine-deployment issues really are secondary. So is an obsession with correctness. I say relax more, and move to a right-on, open-source language and toolset. This is where the company $ should be directed.

Dantalion
  • 101