53

As the title suggests: does a longer Ethernet cable slow your connection down?

SidS
  • 633
  • 1
  • 5
  • 8

14 Answers

43

No, it will not slow down a connection, but you need to be aware of the maximum length of a copper run, which is 100 meters. This needs to include the length of your patch cable from the host to the data point, and also from the patch frame to the switch.

However, when using Cat 6 with a 10 Gbit/s interface, you can only use up to 55 meters and would need to use Cat 6A to achieve 100 meters for this type of transmission.

So if you go above the specified maximum cable length, you will start to see problems relating not just to speed, but also to loss, due to the nature of electrical signals running through the cable.

The 100 meters applies only to a single run without any intermediary network device such as a switch. If you have a switch in between, you can obviously extend this: the maximum then applies to each cable run from device to device.

Using fibre connectivity, you can extend the range depending on the type of fibre and optics used, but that is beyond the scope of your question.

SleepyMan
  • 2,026
  • 11
  • 9
35

For all practical purposes, there will be no effect on the speed of your connection.

There will be a very insignificant amount of delay due to long cables. This won't affect the maximum speed of your connection, but it will cause some latency. pjc50 points out that it's about a nanosecond for every foot of cable length, which is a good rule of thumb used by many engineers when developing systems that are very dependent on latencies at those timescales.

In reality, you will never notice a difference. A "fast" ping time on the Internet is 10 ms, which is 10,000,000 ns. Adding even a few hundred feet of cable isn't going to have a noticeable effect at that point. In fact, nearly every step of the way involves delays which are more extreme than those seen from signal propagation. For example, most consumer-grade routers will wait for the last byte of an incoming packet to be received, and check it for errors, before sending the first byte of the packet on its way. This delay will be on the order of 5,000 ns! Given that the maximum cable run (per the Ethernet spec) is 100 m (roughly 330 ft), the cable itself could never contribute more than about 330 ns of delay!
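For a sense of scale, the figures in this answer can be put side by side in a short script (the 1 ns/ft rule of thumb, the ~5,000 ns store-and-forward delay, and the 10 ms ping are the numbers used above):

```python
# Rough latency comparison, using the figures quoted in the answer:
# ~1 ns of propagation delay per foot of cable, ~5,000 ns of
# store-and-forward delay in a consumer router, and a 10 ms "fast" ping.

NS_PER_FOOT = 1            # rule-of-thumb propagation delay per foot
cable_ft = 328             # ~100 m, the maximum Ethernet run

cable_delay_ns = cable_ft * NS_PER_FOOT
router_delay_ns = 5_000    # one store-and-forward hop
ping_ns = 10_000_000       # a "fast" 10 ms Internet ping

print(f"full-length cable: {cable_delay_ns} ns")
print(f"one router hop:    {router_delay_ns} ns")
print(f"cable as a fraction of the ping: {cable_delay_ns / ping_ns:.5%}")
```

Even a maximum-length run contributes well under 0.01% of a fast ping time, which is the point of the answer.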

Cort Ammon
  • 451
  • 3
  • 3
19

Sort of, to a very tiny extent.

The longer your cable, the higher the latency you experience (gamers call this "ping" time). However, the effect is about one nanosecond per foot of cable, which is unlikely to be noticeable in most cases, especially as a single Ethernet cable is limited to 100 m.

This matters for high-frequency trading and occasionally for email.

It doesn't, of itself, affect the throughput or "bandwidth" of your cable.

pjc50
  • 321
  • 1
  • 4
10

The electric signal will be slowed down by a minimal amount (after all, it travels at roughly 2/3 of light speed; more exactly at 0.64c, the cable's velocity factor). How much time does the signal take to travel 100 meters?

timeTaken = 100 / (299792458 × 0.64) ≈ 0.00000052 seconds

So it just takes an extra 0.00052 milliseconds, which is about 520 CPU cycles (on a 1 GHz CPU).
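As a sanity check, the same calculation can be run in a few lines of Python (assuming, as above, a velocity factor of 0.64):

```python
C = 299_792_458          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.64   # assumed signal speed in the cable, as a fraction of c
LENGTH_M = 100           # maximum Ethernet run

time_taken_s = LENGTH_M / (C * VELOCITY_FACTOR)
cycles_at_1ghz = time_taken_s * 1e9  # a 1 GHz CPU runs one cycle per nanosecond

print(f"propagation time: {time_taken_s * 1e6:.3f} microseconds")
print(f"equivalent CPU cycles at 1 GHz: {cycles_at_1ghz:.0f}")
```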

However, the longer the cable, the weaker the signal becomes. Once the signal is weak enough, it will start to lose bits of information because of interference. Each time a bit is lost, something in the network stack sees that a checksum/parity check fails and asks for that packet again.

Asking for a new packet will take a very long time.

So as long as the signal is strong in the cable, the slowdown is minimal (though it is greater than I expected anyway).

Once you start losing information because the cable is too long, the slowdown increases greatly.

Also note that certain communication protocols are timed, so if the cable is too long the link may not even be usable, because it would go out of sync (that's a by-design issue).

Zac67
  • 90,111
  • 4
  • 75
  • 141
CoffeDeveloper
  • 216
  • 1
  • 5
8

I believe it can, but not in the way most people are thinking about.

Most are thinking of the extra propagation delay through the cable itself. This is valid, but as people have already pointed out, so small that it's essentially always irrelevant.

There is another possibility though. Ethernet cables come in a few different varieties: cat 5, cat 5e and cat 6 are currently in (reasonably) wide use. Cat 5 doesn't officially support gigabit Ethernet, but with a short (e.g., 1 or 2 meter) cat 5 cable that's in good physical condition, you can often get a seemingly reliable gigabit connection anyway¹.

With a longer cable, however, you could get enough signal deterioration that a gigabit connection was no longer possible. In that case, you'd normally fall back to a 100 megabit connection instead, so you wouldn't just gain some irrelevant amount of latency; you'd lose a substantial amount of bandwidth.

This wouldn't have any effect on an Internet connection unless you happen to be one of the fortunate few with more than 100 Mbit/s of bandwidth. Access to local resources could be affected much more drastically, though.


  1. All of these use identical-looking RJ-45 connectors; the difference between cat 5 and cat 5e cable usually isn't obvious except by reading the printing on the cable jacket.
3

The standard is 100 m (~328 ft) before attenuation makes the signal unusable, but the direct answer to your question is yes, a long cable can slow your connection. Attenuation is caused by the internal resistance of the copper, and the resulting signal degradation is what users perceive as lag/slowdown of network connectivity. If the cable is under 100 m, the slowdown is relatively unnoticeable, but it can cause issues if you're coming close to that 100 m mark. And keep in mind that the 100 m length is measured from the point the cable plugs into the port on your computer to the point it plugs into a device that regenerates the signal, like a switch or a router. (I've personally had to change out a cable to a printer because the ~97 m length caused sporadic communication.)

Doug
  • 31
  • 1
3

In theory, yes.

According to the Shannon-Hartley theorem, the maximum achievable capacity of a channel with additive white Gaussian noise is

capacity = bandwidth × log₂(1 + SNR)

Long runs of cable decrease both the channel's bandwidth (as high frequencies are attenuated more strongly) and its SNR (as the signal amplitude decreases).
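A toy illustration of the theorem in Python (the bandwidth and SNR values below are made-up figures for the example, not measured cable data):

```python
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of an additive-white-Gaussian-noise channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical figures: a long run that halves the usable bandwidth
# and knocks 10 dB off the signal-to-noise ratio.
short_run = capacity_bps(100e6, 1000)  # 100 MHz, 30 dB SNR
long_run = capacity_bps(50e6, 100)     # 50 MHz, 20 dB SNR

print(f"short run: {short_run / 1e6:.0f} Mbit/s")
print(f"long run:  {long_run / 1e6:.0f} Mbit/s")
```

Because both factors shrink together, capacity falls off faster than either one alone would suggest.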

fghkngfdx
  • 39
  • 1
2

The electrical signal propagation time for a 100 m maximum length Ethernet cable is only about half a microsecond. This is far less than the amount of time needed for your router, etc. to do their jobs.

This only begins to be relevant at much larger distances, for example from your computer to the server for a game you're playing; however, that number is entirely in the hands of your ISP and its partners, and of the physical locations of you and the server.

2

There are two issues to consider, latency and signal integrity.

Latency is directly proportional to cable length. However, assuming we are talking about twisted-pair Ethernet cables inside a building, the latency will be negligible compared to delays in equipment and in the long-distance connections that make up the Internet.

The other issue is signal integrity: if it gets too bad, the link will start dropping significant numbers of packets. TCP interprets dropped packets as congestion and will drop its speed accordingly.
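How sharply loss caps TCP speed can be sketched with the Mathis et al. back-of-the-envelope model, throughput ≈ MSS / (RTT · √p), ignoring its constant factor; the MSS, RTT, and loss figures below are illustrative assumptions, not measurements:

```python
import math

def tcp_throughput_bps(mss_bytes: int, rtt_s: float, loss_rate: float) -> float:
    """Mathis approximation for steady-state TCP throughput under random loss
    (constant factor of ~1.22 omitted for simplicity)."""
    return (mss_bytes * 8) / (rtt_s * math.sqrt(loss_rate))

# Illustrative assumptions: 1460-byte MSS, 20 ms round-trip time.
clean = tcp_throughput_bps(1460, 0.020, 0.0001)  # 0.01% loss: healthy cable
lossy = tcp_throughput_bps(1460, 0.020, 0.01)    # 1% loss: marginal cable

print(f"0.01% loss: {clean / 1e6:.1f} Mbit/s")
print(f"1% loss:    {lossy / 1e6:.1f} Mbit/s")
```

In this model a hundredfold increase in loss costs a factor of ten in throughput, which is why a marginal cable can feel dramatically slower than the latency numbers alone would suggest.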

If your cable, your devices, and your distance are all in spec, then packet loss should be negligible. However, there is a lot of out-of-spec hardware out there, so I would be wary about operating right at the limit of the distance specification.

Peter Green
  • 13,882
  • 2
  • 23
  • 54
2

Yes. However,

  1. it is not called an Ethernet cable *
  2. it is not a connection, it's a transmission
  3. humans will not perceive delay introduced by cable length alone

*If you are speaking of a local area network you are probably referring to Category 5 or 6 cable. If you are speaking of a wide area link you are probably referring to single mode fiber optic cable.

Ronnie Smith
  • 4,419
  • 1
  • 14
  • 29
0

As a specialist in this line, I can tell you: yes, it does! But the effect is too small to affect you unless you have extended the cable much too far. It also depends on the quality of your cable, connections, and so on, but all of these effects are too small to be noticed. If you are talking about under 20 meters in a home, please do not bother to ask; these factors matter at 100 meters and above. That is the reason why we have optical lines.

ben
  • 1
0

Long cables will increase your latency, since the signal has farther to go. This shouldn't matter much in your case: the signal propagates near the speed of light, so the extra 10 meters will be imperceptible compared to the many miles to whatever server you are accessing. There will be some loss of signal over extremely long runs, which will reduce bandwidth, but it shouldn't be significant over 20 meters; 100 meters is the point where you have to start worrying about the length of the run.

Joey Miller
  • 356
  • 1
  • 5
0

As we all know, for cat 5, cat 5e, cat 6 and cat 7 the maximum distance of a single run is 100 m; per the technical standards, cable lengths of more than 100 m should not be used, as they will experience latency and packet drops.

After 100 m, your signal strength will drop off precipitously, due to loss of bandwidth from signal attenuation over the length of the run, increased noise from crosstalk, and leakage. You are experiencing this all the way from the server; it's just not noticeable at normal distances because of the overall strength of the signal.

For systems that regularly exceed these runs (multiple floors or between buildings) we switch to Fiber Optic, which can run for miles with no noticeable loss.

Sagar Uragonda
  • 844
  • 1
  • 17
  • 74
-1

In my own experience, this is a misleading worry.

In fact, a longer cable will hurt the latency, since the distance is longer, but not the bandwidth. Suppose you compare the speed of a 10 m and a 20 m cable: electrical signals travel through an Ethernet cable at a large fraction of the speed of light (299,792,458 meters per second in vacuum). That means it takes on the order of 0.00000003 seconds for the signal to travel through your 10 m cable, and 0.00000006 seconds to travel through your 20 m cable. You cannot notice the difference here, let alone compared to the many miles to whatever server you are accessing.

However, longer-distance transmission does have signal loss and noise issues, which might have an impact on the internet speed.

Mark Twain
  • 334
  • 1
  • 3