
Difference between bandwidth and latency

The difference between bandwidth and latency is something that confuses a lot of people, but if you are an IT person it is worth knowing, because sooner or later you will face a network problem related to it. Part of the confusion has been created by Internet providers, who always recommend increasing bandwidth for any Internet speed problem; as we will see, the speed of an Internet connection is not always dictated by bandwidth. This part is very important.

What is the difference between bandwidth and latency?

I will give you an analogy to make it easier to understand if you are still confused. Imagine a highway with 4 lanes where the speed limit is 70 mph. On the Internet, bandwidth is the number of lanes, and latency is the 70 mph speed limit. If you want to increase the number of cars that travel through the highway, you can add more lanes, but because the highway has too many curves and bumps you cannot raise the speed limit, so all cars still have to travel at 70 mph. It does not matter how many lanes the highway has; the cars will reach their destination at the same time regardless of the size of the highway.

Why does increasing bandwidth increase download speed then, you might ask; isn't that speed? No: by increasing bandwidth you increase capacity, not speed. Following the highway analogy, imagine that the vehicles traveling on that highway were all trucks delivering house bricks. All the trucks still have to travel at 70 mph, but once they arrive at their destination, 6 loads of bricks are delivered instead of 4 because 2 more lanes were added to the highway. The same thing happens when you add bandwidth to an Internet connection: the capacity is increased, but the latency (the "speed") stays the same.

What's Bandwidth?

To understand bandwidth and latency we need a clear definition of both, so let's start with bandwidth. Bandwidth is the amount of data that can be transferred from one point to another in a given amount of time, normally measured in bits per second. Internet providers usually advertise their Internet bandwidth as something like 40/40 Mbps, which means that 40 megabits of data can be downloaded or uploaded every second. Most people are only familiar with units like megabytes and gigabytes, but Internet providers still quote megabits because it makes the numbers look bigger; in reality a 40/40 Mbps connection moves only about 5 megabytes per second. The important thing to remember about bandwidth is that bandwidth is not speed.
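As a quick illustration of the units, here is a minimal sketch in Python (the 40 Mbps figure is just the example plan above) that converts an advertised megabit-per-second rating into megabytes per second:

    # Convert an advertised link speed in megabits per second (Mbps)
    # into megabytes per second (MB/s). 1 byte = 8 bits.
    def mbps_to_megabytes_per_second(mbps):
        return mbps / 8

    advertised = 40  # e.g. the "40/40 Mbps" plan mentioned above
    print(advertised, "Mbps is about", mbps_to_megabytes_per_second(advertised), "MB/s")
    # prints: 40 Mbps is about 5.0 MB/s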

What's Latency?

Latency is the time that a data packet takes to travel from one point to another. Another term for latency is delay. One important thing to remember about latency is that it is a natural limit of physics: relativity caps everything at the speed of light, so in our universe everything needs time to travel. On the Internet, the time it takes a packet to travel (from a Google data center to your computer, for example) is called latency.


What causes latency?

Latency is caused by the distance and the quality of the medium that the Internet packets travel through. For example, the latency through a fiber optic connection is lower than through a copper cable, and the latency through a copper cable is lower than through a satellite connection. Satellites use the microwave spectrum to relay data connections from space, so the signal has to travel all the way to orbit and back.
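To see how distance alone puts a floor under latency, here is a minimal sketch in Python; the ~200,000 km/s figure for light in fiber and the New York-to-London distance are approximations assumed for illustration:

    # Rough lower bound on latency from distance alone. Light in optical
    # fiber travels at roughly 200,000 km per second (about 2/3 of the
    # speed of light in a vacuum) -- an approximation used for this sketch.
    SPEED_IN_FIBER_KM_PER_S = 200_000

    def propagation_delay_ms(distance_km):
        # One-way propagation delay in milliseconds over fiber.
        return distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

    # Example: New York to London is roughly 5,600 km in a straight line.
    print("One way:", round(propagation_delay_ms(5600), 1), "ms")        # ~28 ms
    print("Round trip:", round(2 * propagation_delay_ms(5600), 1), "ms")  # ~56 ms

Real routes are longer than the straight-line distance, and every router along the way adds processing time, so measured latency is always higher than this floor.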

When is latency a problem?

Some people don't notice and don't care about latency as long as web pages, videos, and other multimedia content load fast. Latency becomes a problem only when real-time data transfer is necessary, for example VoIP calls, online face-to-face meetings, etc. I have found that any latency beyond roughly 200 ms will give you problems in real-time communication.

How to test latency?

The quickest way to test latency from your computer is to use the ICMP protocol through the ping command. For example, if I want to test the latency between my computer and Gmail's data center, I type the command ping gmail.com.
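If you would rather measure it from a script, here is a minimal sketch in Python; it times a TCP handshake instead of sending real ICMP packets (which usually require elevated privileges), and the hostname and port are only examples:

    # Approximate round-trip latency by timing a TCP handshake.
    # This is not ICMP ping, but it gives a similar order-of-magnitude
    # number without needing raw-socket privileges.
    import socket
    import time

    def tcp_latency_ms(host, port=443, timeout=3.0):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=timeout):
            pass  # the handshake itself is what we are timing
        return (time.perf_counter() - start) * 1000

    print("gmail.com:", round(tcp_latency_ms("gmail.com"), 1), "ms")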

Conclusion

I hope the difference between bandwidth and latency is clearer now. If you have any suggestions or comments, please use the comment box (this is most important to me).

Why does latency vs. bandwidth matter?

Gaming actually doesn't need "fast" internet in the sense that it's usually marketed. Bandwidth is relatively unimportant (except for downloading game patches faster), but latency is key. Low ping times are highly prized in fast-paced twitch games like Call of Duty or Battlefield.
Streaming video or audio is mostly a matter of bandwidth, but latency can also cause problems. In theory a high-bandwidth connection with high latency would work, but in practice it rarely does. Most streaming services aren't equipped to buffer long enough to stream seamlessly, even with a long buffering wait at the beginning. There are apps like NightShift that pre-load streaming content for high-latency users, but at that point you're essentially downloading videos rather than streaming at all.
Video chat works best with low latency. It will still work with high latency, but you'll have to put up with the awkward delay in the conversation. HD video streaming requires more bandwidth, but the blurry sort of chat most of us are used to only requires a couple of Mbps.
Browsing works best with decent latency. While you can get by with high latency, it is frustrating to wait several seconds every time you visit a new page. More bandwidth doesn't hurt either, especially if you're looking at high-resolution images, GIFs, or video.
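To see why high latency makes browsing feel slow even on a high-bandwidth connection, here is a back-of-the-envelope sketch in Python; the number of round trips per page and the sample RTT values are assumptions for illustration:

    # A page load usually needs several sequential round trips: DNS lookup,
    # TCP handshake, TLS handshake, the HTML itself, then follow-up requests
    # for CSS/JS/images. The counts below are illustrative, not measured.
    def estimated_load_time_ms(rtt_ms, round_trips=10):
        return rtt_ms * round_trips

    for rtt in (20, 100, 600):  # e.g. fiber, mobile, satellite
        print("RTT", rtt, "ms ->", estimated_load_time_ms(rtt) / 1000, "s per page")
    # RTT 20 ms -> 0.2 s per page
    # RTT 100 ms -> 1.0 s per page
    # RTT 600 ms -> 6.0 s per page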
