
According to some media sources, Intel claims that a computer becomes slower after it has been used for a long time, while its power consumption remains almost the same.

At first I thought this was just marketing, but then a friend confirmed it is truly the case. I can understand that when a bicycle is used for a long time, friction increases within its components, but that can be fixed anyway. What about chips? Is it that the capacitors take longer to charge?

Is this really the case, or is it just marketing?

Source: media material (if you can read Chinese)

Source: transistor aging, thanks to @Sergio

Clarification: I am definitely not talking about software-related stuff; otherwise I wouldn't post here.

zinking
  • Could you please give us the source or a link to the Intel claim you're referring to? – Ricardo Jul 20 '14 at 02:20
  • Post the link to Intel's claim, please. I bet Intel has published an English version somewhere. – Nick Alexeev Jul 20 '14 at 02:37
  • Set computers aside for a moment: I have 15-year-old digital dictaphones and music players, which have appreciably slowed down in their menu response / LCD screen updates over the years. On a whim, I opened up one such device and found a thin layer of dust all over the tiny PCB. I cleaned the PCB with IPA and cleaned up the battery contacts. The device refreshes its display and accepts inputs appreciably faster now. My conjecture is that the dust was causing, among other things, heat build-up in the device, and for some reason this resulted in a slow-down. Could something similar be true here too? – Anindo Ghosh Jul 20 '14 at 08:26
  • @AnindoGhosh I bet it's all only you and has nothing to do with the device's real performance. A warmed-up LCD display is actually faster than a cold one. Slow menu response might be due to inefficient algorithms, or simply algorithms that take appreciably longer as you put more data on the device. The dust on the PCB on those devices is inconsequential when it comes to speed. Really. – Kuba hasn't forgotten Monica Jul 21 '14 at 07:56
  • @KubaOber Umm, I'd have to say your view is about as speculative as mine - and if you look at my post history, you'll probably realize that before speculating, I actually measured the time for responses to specific sequences of button clicks before and after cleaning. No storage content was erased in my efforts, either, so the amount of data isn't a factor. – Anindo Ghosh Jul 21 '14 at 08:42
  • @AnindoGhosh Hopefully you used electromechanical actuators as the source of clicks, and a logic analyzer capture of the power-up, button presses, and LCD output. Surely you've started with a fresh flash-memory load, right? With a human in the loop things can go spectacularly bad. Similarly, they will go bad if you don't ensure that the conditions are exactly identical when it comes to memory contents. Just because you didn't change the memory doesn't mean the firmware wasn't doing housekeeping in the background, for example. Extraordinary claims require extraordinary care in data collection. – Kuba hasn't forgotten Monica Jul 21 '14 at 11:42
  • @AnindoGhosh And surely you can reproduce the results by heating it all up much more than it heats up by itself, right? I mean, it's trivial to get it to 60 °C in an oven and have it molasses-slow afterwards, right? – Kuba hasn't forgotten Monica Jul 21 '14 at 11:47
  • @AnindoGhosh The experimental technique in a nutshell was given by Feynman in the 1974 Caltech commencement address. – Kuba hasn't forgotten Monica Jul 21 '14 at 11:48
  • Just speculating ... if modern PCs (CPUs?) have PLLs and temperature sensors, it might make sense that as the processor gets hot, they reduce the speed. If aging (plus dirt) causes a temperature increase, the PC may deal with that by reducing the speed. – Tut Jul 21 '14 at 14:33
  • @KubaOber As it happens, I live in Mumbai, so getting to 60 °C is a small jump from ambient :-) It's more of a struggle to keep the temperature below 30! Yes, I've checked that the old Yamaha voice recorder slows down hugely when operated in the kitchen (40+ degrees) compared to my lab (approx. 27 degrees), and no mechanical actuators or electronic timing are needed to distinguish between 7 seconds from click to play versus ~1 second. Your sarcasm is amusing, I must admit. Keep at it. – Anindo Ghosh Jul 21 '14 at 15:02
  • @AnindoGhosh You've measured the clock's tempco, right? – Kuba hasn't forgotten Monica Jul 21 '14 at 15:06
  • @KubaOber Of course not; it doesn't affect my speculation about the cause of the slowdown, and I am not attempting a root-cause analysis, merely speculating. – Anindo Ghosh Jul 21 '14 at 15:09
  • There is a phenomenon that occurs in integrated circuits which is similar in cause and appearance to the formation of "oxbow bends" in rivers. Current flowing through a thin plated wire causes (don't ask me for the mechanism) metal atoms from the "inside" of a bend in the wire to be lifted up and redeposited on the "outside" of the bend. The most worrisome aspect of this phenomenon is that it causes shorting between adjacent wires, but it also causes wires to become successively thinner, presumably impacting performance in a minor way. – Hot Licks Jan 21 '15 at 19:32
  • And, somewhat less exotic: in any semiconductor circuit, atoms will migrate over time as entropy attempts to make the circuit become one homogeneous blob. This causes the boundaries between, say, the P and N layers of a transistor to get more and more "fuzzy" over time. The effect is accelerated by heat, of course. – Hot Licks Jan 21 '15 at 19:36

6 Answers

8

Well, let's just ignore the software-bloat aspect of computer slowdown and look at things on a piece-by-piece basis.

CPU, motherboard, and memory (caches and DRAM):

  • NO. These are clocked systems, and the clock is designed to drive the system just fast enough to ensure that the device finishes one task before starting the next (in simplistic terms). These clocks do drift and age a little, but that drift is measured in PPM (parts per million); see the quick arithmetic sketch below. If the part slows down enough that it can't keep up with the clock, then it starts to have major problems. Like most things digital, it runs until it... well, doesn't.
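
To put that PPM figure in perspective, here is a minimal arithmetic sketch; the 100 ppm tolerance is an assumed, typical crystal spec, not a figure taken from this answer:

```python
# Back-of-the-envelope: how far can a crystal oscillator drift?
# Assumption: a garden-variety crystal with 100 ppm total tolerance/aging.
ppm = 100                               # parts per million (assumed figure)
seconds_per_year = 365 * 24 * 3600      # 31,536,000 s

drift_seconds = seconds_per_year * ppm / 1e6
print(f"A {ppm} ppm clock drifts at most {drift_seconds:.0f} s per year")
# -> about 3154 s/year, i.e. a ~0.01% frequency change; the CPU still
#    runs at essentially its rated speed for its whole life.
```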

That really just leaves the HDD:

  • It is POSSIBLE that this will slow your machine down. We do know that there can be corruption in the stored bits (bit rot), and the operating system can mostly correct these errors. But in reality they tend either to cascade and then go south very quickly, or to happen at a consistent background rate (which wouldn't affect this premise).

  • I would say that some machines might slow down rapidly and spiral out; otherwise the effect will probably not be noticeable.

On the whole, I'd say nonsense.

placeholder
  • "It is POSSIBLE that this will slow your machine down." It won't ever slow your machine down - how would it? By divine intervention? Not unless you manually reduce the clock rates or multipliers to lower the error rates. There would have to be a mechanism that adjusts the clock based on the CPU's error rates, and that mechanism is simply absent in usual computer designs. With any modern PC, the only way to do it would be to manually tweak the BIOS settings. In that case, it's the user who slows down the computer, in order to keep the error rates acceptable to her. – Kuba hasn't forgotten Monica Jan 21 '15 at 16:41
  • If the hard drive is badly fragmented performance can suffer. – Pete Becker Jan 21 '15 at 20:51
7

There are lots of things that can cause a Windows computer to become slower as you use it, but they're all software-related or related to the interaction of software and hardware.

Additional programs get installed and consume resources (even when they're uninstalled, they may leave crap in the registry). Many programs install processes that run at startup and do things like listen on a port or check for updates.

Disk drives can become fragmented, slowing retrieval. Even a single program that uses RAM can require garbage collection and the like as the RAM gets filled and recycled in variously sized blocks (see the toy illustration below).
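
As a toy illustration of the garbage-collection point, here is a minimal sketch; Python's collector stands in for any managed heap, and the object counts are arbitrary:

```python
# Toy illustration: full garbage-collection passes take longer as the
# number of live objects grows. Python's gc stands in for any managed heap.
import gc
import time

for n_objects in (100_000, 1_000_000):
    heap = [[i] for i in range(n_objects)]   # grow the live heap
    t0 = time.perf_counter()
    gc.collect()                              # force a full collection pass
    dt = time.perf_counter() - t0
    print(f"{n_objects:>9,} live objects -> full GC pass took {dt * 1e3:.1f} ms")
# The second pass is markedly slower: more live state means more work per
# collection, one of the software-side slowdowns described above.
```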

Some people (not me) swear by completely wiping and re-installing Windows and most-used programs from scratch on a clean disk drive. You will definitely see a speed improvement if you've been using the same computer for years without re-installing from scratch.

Spehro Pefhany
  • I think the phenomena the O.P. is groping for are more along the lines of slow diffusion of semiconductor dopants over time. – Nick Alexeev Jul 20 '14 at 03:07
  • I am definitely not talking about software-related stuff; otherwise I wouldn't post here. It sounds like people don't think this is true either, which actually fits my understanding. – zinking Jul 20 '14 at 03:08
  • @NickAlexeev A computer would not slow down with whatever chip degradation we could imagine; it would simply fail to operate reliably at the design speed. The crystals and clock generators won't slow down appreciably. – Spehro Pefhany Jul 20 '14 at 03:33
  • @SpehroPefhany I'm guessing that parasitic capacitances may change. However, we're still waiting for references to be posted by the O.P. – Nick Alexeev Jul 20 '14 at 03:36
  • @NickAlexeev I don't have any more references; the attached material is where I started this question. I have modified the question description as well. – zinking Jul 20 '14 at 03:54
  • NAND flash erase times get longer with the number of erases performed. If you have an SSD in your computer it can, eventually, get slower as it takes longer to erase blocks. – Majenko Jul 20 '14 at 09:35
  • RAM chips already have defects when they come out of the factory, and serious error-correction algorithms and redundant circuitry need to be there to increase yield. As chips age, errors increase, so more effort is spent correcting them (I am not 100% sure about this and thus I leave it as a comment). SSDs also become slower: the firmware keeps track of write operations and tries to distribute them evenly across the flash, so the more the drive is used, the harder it is to find free space, and files might need to be fragmented. – Evan Jul 20 '14 at 22:40
  • @EvangelosEm Do you have a cite on that RAM claim? – Spehro Pefhany Jul 20 '14 at 22:48
  • @SpehroPefhany: I'll have to scour through my books (Architecture Design for Soft Errors and Fault-Tolerant Architecture), but here is a patent about increasing yield in RAM chips by using error-correction techniques: US 4335459 A - Single chip random access memory with increased yield and reliability. And here is a random thesis I found that has plenty of references. – Evan Jul 21 '14 at 00:25
  • @EvangelosEm Okay, just wondering, 'cause I never heard of it. No big deal. – Spehro Pefhany Jul 21 '14 at 00:30
1

This discussion has become a bit fragmented, so I'll try to offer an alternative answer. Please note that the original question mentions the chips themselves, so abstracting away all peripheral- and software-inherited problems, I would say the answer is YES: computers could get slower and/or present faulty behaviour over time due to silicon aging. As the silicon loses its properties over time, due to many factors such as heat, the properties of the "internal components" of processors and other silicon-based chips will gradually decay.

As you can see here, there are many articles studying this phenomenon.

I hope this helps as another piece of this puzzling matter! :)

Sergio
  • Really glad to hear the other side, because all the other answers suggest the reference is not trustworthy. That aside, adding the keyword "slow" to your reference search didn't show any relevant results. – zinking Jul 21 '14 at 14:18
  • The degradation of the transistors is where the problem resides, as you can read here (http://spectrum.ieee.org/semiconductors/processors/transistor-aging): "Over time [...] with a little more energy than the average will stray out of the conductive channel between the source and drain and get trapped in the insulating dielectric. This process [...] eventually builds up electric charge within the dielectric layer, increasing the voltage needed to turn the transistor on. As this threshold voltage increases, the transistor switches more and more slowly." – Sergio Jul 21 '14 at 14:40
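
To visualize why the rising threshold voltage quoted above slows switching, here is a rough sketch using the alpha-power-law delay model; the supply voltage, thresholds, and alpha are assumed illustrative values, not Intel data:

```python
# Rough illustration of the quoted effect: gate delay versus threshold
# voltage, using the alpha-power-law model
#   delay ∝ Vdd / (Vdd - Vth)^alpha
# All numbers below are assumed, illustrative values.
ALPHA = 1.3   # velocity-saturation exponent, typical for modern CMOS (assumed)
VDD = 1.0     # supply voltage in volts (assumed)

def relative_delay(vth: float, vdd: float = VDD, alpha: float = ALPHA) -> float:
    """Gate delay, up to a constant factor, per the alpha-power law."""
    return vdd / (vdd - vth) ** alpha

fresh = relative_delay(vth=0.30)   # new transistor
aged = relative_delay(vth=0.35)    # after charge trapping raises Vth
print(f"Delay increase from a 50 mV threshold shift: {100 * (aged / fresh - 1):.1f}%")
# ~10% slower in this toy model; the chip still runs at its rated clock
# until the timing margin is exhausted.
```
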
1

Anindo Ghosh's discussion of temperature has been rubbished in the comments, but it's definitely a potential issue with computers. Modern systems will have "thermal throttling", which will reduce the clock speed when the die temperature of the CPU is above a limit. The ability of the system to keep cool diminishes with time; thermal paste dries out, dust accumulates on heatsinks, and thermal cycling of the heatsink mounting may worsen the thermal contact.
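
On a Linux box you can watch for this yourself; here is a minimal sketch using the psutil library (assuming it is installed and the platform exposes frequency and temperature sensors):

```python
# Quick check for thermal throttling on a running system.
# Assumes Linux with the psutil package installed and sensors exposed.
import psutil

freq = psutil.cpu_freq()  # current/min/max in MHz; may be None on some systems
if freq is not None:
    print(f"CPU frequency: {freq.current:.0f} MHz (rated max {freq.max:.0f} MHz)")

temps = psutil.sensors_temperatures()  # empty dict if no sensors are exposed
for chip, readings in temps.items():
    for r in readings:
        print(f"{chip}/{r.label or 'temp'}: {r.current:.0f} °C")

# If the reported frequency sits well below the rated maximum while the die
# temperature is high, the machine is likely throttling - often fixable by
# cleaning heatsinks or renewing thermal paste, as described below.
```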

Most systems will benefit from dusting with compressed air every few months, and I've personally had good results from replacing heatsink thermal paste on overheating systems, especially laptops.

pjc50
0

No, it is not true for computers that have typical, clocked architectures. An asynchronous CPU's real performance would be tied closely to the performance of the silicon and would scale with temperature, logic supply voltage(s), and aging.

It's a rather nice question, in fact: an accurate answer can be one sentence long :)

I stress that transistor aging will not cause the computer to "slow down". Slowing down would imply that there is a control mechanism that measures CPU performance and slows down the clock sources so as to maintain acceptable performance of the CPU. It must be noted that, similar to overclocking, a CPU with aged transistors will not fail outright. It will simply operate at a progressively higher error rate, eventually becoming unusable.

So, my first paragraph stands true whether the CPU exhibits transistor-age-related performance degradation or not.

If your question was about whether the transistors will slow down - please reword it so! You're asking about a computer as a whole, not about what the CPU's transistors might be doing. The transistors may be slowing down, but the computer will operate at its rated clock speeds (thermal and power throttling notwithstanding), with ever-rising error rates, until it becomes useless. At that point a user might manually reduce clock multipliers in the BIOS to underclock the core and restore functionality. I assume here that the highest-clocked circuitry is most prone to the effects of this aging. Only this manual control process will lead from transistor slow-down to computer slow-down. Absent the control process, no slowing down can be observed, by definition. You won't observe a slow-down, but you will eventually observe unacceptable error rates, leading to data corruption on writable storage. A toy model of this failure mode is sketched below.
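
As a toy numeric illustration of "runs at rated speed until errors appear": the clock period stays fixed while the critical-path delay grows with age, and errors start only once the delay eats through the timing margin. All numbers are invented for the sketch:

```python
# Toy model of the failure mode described above. The clock period is fixed
# by the clock generator; the critical-path delay grows with transistor
# aging. All numbers below are invented, illustrative values.
CLOCK_PERIOD_PS = 350        # fixed clock period in picoseconds (assumed)
INITIAL_DELAY_PS = 330       # critical-path delay when new (assumed)
AGING_PS_PER_YEAR = 4        # delay growth from transistor aging (assumed)

for year in range(8):
    delay = INITIAL_DELAY_PS + AGING_PS_PER_YEAR * year
    status = "OK" if delay <= CLOCK_PERIOD_PS else "TIMING ERRORS"
    print(f"year {year}: path delay {delay} ps / period {CLOCK_PERIOD_PS} ps -> {status}")
# The machine never "slows down": it computes at full rated speed for years,
# then starts miscomputing once the timing margin is gone.
```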

  • Your answer provides no evidence or reasoning to refute Intel's claim that the maximum achievable clock speed is reduced as transistors age. – Pete Kirkham Jul 21 '14 at 21:16
  • @PeteKirkham Everyone here seems to forget that a clock doesn't ask the CPU "Hey, have you aged yet?" A clocked CPU will run at the clock speed until it fails. The failure will exhibit as an error rate above the acceptance criteria. It'll be gradual. Intel's claim might be true, but it's not practically relevant. All it means is that a CPU may fail due to old age, with the failure mechanism being transistor slow-down with aging. This might be an interesting fact, but still, it's simply an old-age failure mode. – Kuba hasn't forgotten Monica Jan 21 '15 at 16:29
  • @PeteKirkham The question isn't about whether Intel's claim is true, but about whether, if such a claim were true, the computer would progressively slow down. If the author is asking something else, they are free to word it carefully as needed. – Kuba hasn't forgotten Monica Jan 21 '15 at 16:37
  • Modern motherboards automate the overclocking process. It's not manual; it's part of the BIOS. – Pete Kirkham Jan 21 '15 at 21:00
  • @PeteKirkham It is for all intents and purposes manual still, because the BIOS only has a very short window for its "go/no-go" testing. The only way it could be truly long-term automated - and perhaps that's how it's done today - is that the management core is taking measurements through built-in diagnostic structures of the chip, and throttles the clock accordingly. But it's not a part of the startup sequence, if that's what was implied. It would need to be a continuous process - it'd only result in observed part slowdown with age. Otherwise, the CPU will "just fail". – Kuba hasn't forgotten Monica Dec 18 '22 at 19:05
-1

Sorry to bump such an old thread. I'll break this down.

Semiconductors are affected by heat, but could ideally operate forever if thermal degradation were not an issue.

Say you have an RC circuit whose input steps from 0 V to the source voltage at time zero:

$$V_{source}\,u(t) = RC\frac{dV_{cap}}{dt}+V_{cap}$$

The solution to this equation is

$$V_{cap}(t)=V_{source}\left(1-e^{-t/RC}\right)u(t)$$

Now look at the rate of change of the capacitor voltage:

$$\frac{dV_{cap}}{dt}= \frac{V_{source}}{RC}\,e^{-t/RC}$$

This means that as resistance increases (through thermal degradation over time), it takes longer for each gate to reach steady state.

So you can bump up your ceiling voltage to compensate, or increase the clock period. If you do neither, you are just going to have a lot of error correction going on (i.e., a lot of unknown values read from your CPU). It is said that it takes about 5 time constants (5RC) for a capacitor to reach steady state, so if your time constant increases, you need to increase your period, which reduces your frequency (a minimal numeric sketch follows below). So age and heat degrade a CPU, indeed making it slower and less accurate.
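
Following that 5RC rule of thumb, here is a minimal sketch of how a rising effective resistance would cap the settling-limited clock frequency; the component values are invented for illustration:

```python
# Rule of thumb used above: a node needs ~5 time constants (5*R*C) to settle,
# so the usable clock period must be at least 5*R*C. Values are invented.
C = 1e-15                     # effective node capacitance: 1 fF (assumed)

def max_frequency(r_ohms: float, c_farads: float = C) -> float:
    """Highest clock frequency that allows ~5 RC of settling per cycle."""
    return 1.0 / (5.0 * r_ohms * c_farads)

for r in (1_000, 1_100, 1_200):   # effective resistance rising with degradation
    print(f"R = {r} ohm -> settling-limited f_max = {max_frequency(r) / 1e9:.0f} GHz")
# Output: 200, 182, 167 GHz. A 20% rise in R cuts the settling-limited
# frequency by ~17%, though a real CPU's fixed clock would not track this.
```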

Also note that by increasing your voltage to compensate, you will increase your heat and accelerate the degradation.

  • Are you saying that a computer recognizes it needs to slow down, due to aging, and then the computer slows its own clock? – Marla Oct 12 '18 at 02:59
  • I think the OP was referring to simple fragmentation of the HDD over time, usually in days to weeks. It's less of an issue now since Windows 7 and Linux, which defrag during idle moments. – Oct 12 '18 at 05:29