Friday, September 15, 2017

Illustrating the Difference Between Bandwidth and Latency

Earlier this week my AP CS Principles classes were discussing the difference between latency and bandwidth. My curriculum resource (I’m using the code.org curriculum) describes the two terms like this:

  • Bandwidth - Transmission capacity measured by bit rate.
  • Latency - Time it takes for a bit to travel from its sender to its receiver.

These are useful descriptions, but how do you make them real for students? Today I found two great examples of high bandwidth but high latency.
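Before getting to those examples, one way to make the relationship concrete is a quick back-of-the-envelope calculation. Below is a minimal Python sketch; the link numbers are invented for illustration rather than taken from any real network. The point it makes is that latency dominates the wait for a small message, while bandwidth dominates the wait for a bulk transfer.

    # Rough sketch with invented numbers: delivery time is roughly
    # one-way latency plus the time to push all the bits onto the link.

    def delivery_time_s(payload_bits, bandwidth_bps, latency_s):
        return latency_s + payload_bits / bandwidth_bps

    one_packet = 12_000        # a roughly 1500-byte packet, in bits
    one_gigabyte = 8 * 10**9   # about 1 GB, in bits

    for label, bits in [("one packet", one_packet), ("1 GB file", one_gigabyte)]:
        t = delivery_time_s(bits, bandwidth_bps=10**9, latency_s=0.100)
        print(f"{label}: {t:.3f} s on a 1 Gbps link with 100 ms latency")

    # one packet: 0.100 s  -- almost all of the wait is latency
    # 1 GB file:  8.100 s  -- almost all of the wait is limited by bandwidth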

The first example was in the form of the graphic on the right. The latency is high; it takes a lot of time for the pigeon to fly to the other site. So moving one bit, or even one byte, at a time would not be a good idea. On the other hand, since all of the data arrives at the destination at the same time, the bandwidth is high. The story is a bit dated but still interesting.

I shared that graphic on Facebook and a friend of mine who works for Amazon Web Services pointed me to a more current story.

Amazon’s Snowmobile is Actually a Truck Hauling a Large Hard Drive

From the article:

“Using multiple semis to shuttle data around might seem like overkill. But for such massive amounts of data, hitting the open road is still the most efficient way to go. Even with a one gigabit per-second connection such as Google Fiber, uploading 100 petabytes over the internet would take more than 28 years. At an average speed of 65 mph, on the other hand, you could drive a Snowmobile from San Francisco to New York City in about 45 hours—about 4,970 gigabits per second.”

Now that is a story that speaks to moving really large amounts of data. And again, it shows the difference between latency and bandwidth.
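The arithmetic behind those figures also makes a nice quick exercise. Here is a minimal Python sketch using the article’s round numbers (100 petabytes, a 1 gigabit per second line, a 45-hour drive). The exact answers depend on unit conventions and overhead, so this simple version comes out a little under the article’s 28-year figure, but the ballpark is the same.

    # Back-of-the-envelope check of the Snowmobile story's figures.

    payload_bits = 100 * 8 * 10**15          # 100 petabytes, in bits

    fiber_bps = 10**9                        # a 1 Gbps line such as Google Fiber
    fiber_years = payload_bits / fiber_bps / (60 * 60 * 24 * 365)

    drive_seconds = 45 * 3600                # San Francisco to New York at ~65 mph
    truck_bps = payload_bits / drive_seconds

    print(f"Over fiber: roughly {fiber_years:.0f} years")
    print(f"By truck:   roughly {truck_bps / 10**9:,.0f} gigabits per second")

    # Over fiber: roughly 25 years
    # By truck:   roughly 4,938 gigabits per second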

1 comment:

  1. Garth, 2:49 PM

    I use the water hose comparison. Big hose with low water pressure is large bandwidth, high latency. Small hose, high water pressure is the opposite. Big hose with lots of water pressure is glorious.
