Need for Speed - Higher bandwidth, Lower Latencies

Blog post created by Rohit Vijay on Dec 8, 2014

The reach of the internet today is unlike that of any other technology: billions of connected devices, an ever-growing user base for countless online services, and hence the demand for high-performance websites and applications. No matter how we connect to the internet, our expectations for the online experience are always high - "Speed is my online right, and I shall have it". For businesses, performance is key to higher user engagement, higher user retention, and higher conversion rates.

 

So what are the critical components which dictate the performance of all network traffic?

Latency & Bandwidth - Let's take a closer look at these terms, what they mean, and how they dictate performance.

 

Latency - Latency is the time it takes for a data packet to travel from its point of origin to its destination.

Over the internet, however, there is no direct channel between sender and receiver. Data packets are transmitted hop by hop, and each of these hops contributes to the overall time it takes for the packet to be delivered.

 

Latency can be broken down as:

  • Propagation delay - Amount of time required for a message to travel from the sender to the receiver, which is a function of distance over the speed with which the signal propagates.
  • Transmission delay - Amount of time required to push all the packet's bits into the link, which is a function of the packet's length and the data rate of the link.
  • Processing delay - Amount of time required to process the packet header and determine the packet's destination.
  • Queuing delay - Amount of time the incoming packet waits in the queue until it can be processed.

The total latency between client and server is the sum of all these delays. So what causes these delays?

 

Propagation delay is dictated by the distance between client and server and by the medium through which the packet travels - copper wire or optical fiber, for example.

Transmission delay is dictated by the available data rate of the transmitting link - consider a 10 Mb file transmitted over a 1 Mbps versus a 10 Mbps link.
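These two delays are simple arithmetic. A minimal Python sketch, using illustrative numbers - a roughly 5,600 km client-server distance and a signal speed in optical fiber of about 200,000 km/s (roughly two-thirds the speed of light in vacuum):

```python
# Propagation delay: distance divided by the propagation speed of the medium.
# Signal speed in optical fiber is roughly 200,000 km/s.
def propagation_delay_ms(distance_km, speed_km_per_s=200_000):
    return distance_km / speed_km_per_s * 1000

# Transmission delay: data size divided by the data rate of the link.
def transmission_delay_s(size_megabits, rate_mbps):
    return size_megabits / rate_mbps

# ~5,600 km (an illustrative transatlantic distance) over fiber:
print(propagation_delay_ms(5600))   # 28.0 ms - one way, best case

# The 10 Mb file from above over a 1 Mbps vs a 10 Mbps link:
print(transmission_delay_s(10, 1))  # 10.0 s
print(transmission_delay_s(10, 10)) # 1.0 s
```

Note that the 28 ms figure is a physical lower bound for that distance: no protocol optimization can push a packet through fiber faster than the signal propagates.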

Processing delay is dictated by how fast the router can process each packet header to determine the outgoing route and to run other checks on the data.

Queuing delay is dictated by the number of packets the router is able to process per second. If packets arrive faster than the router can process them, they are queued in the router's incoming buffer.

 

Each packet traveling over the network will incur many instances of each of these delays. The farther the distance between the source and destination, the more time it will take to propagate. The more intermediate routers we encounter along the way, the higher the processing and transmission delays for each packet. Finally, the higher the load of traffic along the path, the higher the likelihood of our packet being delayed inside an incoming buffer.
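Putting the pieces together, a rough model of end-to-end latency sums the four delays at every hop along the path. A hypothetical sketch - all per-hop numbers below are made up purely for illustration:

```python
# Each hop contributes propagation, transmission, processing, and
# queuing delay. All values are illustrative, in milliseconds.
hops = [
    # (propagation, transmission, processing, queuing)
    (5.0, 0.1, 0.05, 0.2),   # client -> local router
    (12.0, 0.1, 0.05, 1.5),  # local router -> ISP core
    (28.0, 0.1, 0.05, 0.3),  # ISP core -> destination network
]

# Total one-way latency is the sum of every delay at every hop.
total_latency_ms = sum(sum(hop) for hop in hops)
print(round(total_latency_ms, 2))  # 47.45
```

The model makes the text's point concrete: adding hops or congestion (the queuing column) inflates the total even when no single delay looks large on its own.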


Bandwidth - Bandwidth is the maximum throughput of a logical or physical communication path.

The core network (internet backbone) - the optical fiber links that form the core data paths of the internet - is capable of bandwidth in the hundreds of terabits per second per cable. However, the available capacity at the edges of the network is much, much less, and varies wildly based on the deployed technology: dial-up, DSL, cable, a host of wireless technologies, fiber-to-the-home, and even the performance of the local router.

 

The available bandwidth to the user is a function of the lowest-capacity link between the client and the destination server.
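In other words, end-to-end throughput is set by the bottleneck link, no matter how fast the backbone is. A small sketch with assumed link capacities:

```python
# Available end-to-end bandwidth is the minimum capacity along the path.
# Capacities below (in Mbps) are illustrative, not measured values.
path_links_mbps = {
    "home Wi-Fi": 100,
    "DSL last mile": 20,
    "ISP core": 10_000,
    "server uplink": 1_000,
}

# The slowest link caps the whole path.
bottleneck = min(path_links_mbps, key=path_links_mbps.get)
print(bottleneck, path_links_mbps[bottleneck])  # DSL last mile 20
```

Here the terabit-class core is irrelevant to the user's experience; the 20 Mbps last-mile link dictates what the path can deliver.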


 

We are constrained in our ability to make packets travel faster through the communication medium beyond a certain point. But there is good scope for improvement when it comes to latency: we can employ various techniques (caching, prefetching, etc.) to decrease it. By making sure content is available close to the end user, we can cut down on the factors that contribute to latency and hence improve performance.
