Latency


What Does Latency Mean?

Latency is a perceived or actual delay in response time.


In networking, latency describes the time it takes a data packet to travel from one network node to another. The term is also used to describe delays that occur as data moves between a computing device’s RAM and its processor.

High latency creates bottlenecks and is associated with low quality of service (QoS), jitter and a poor user experience (UX). The impact of latency can be temporary or persistent, depending on the source of the delay.

Latency on the internet is often measured with the network utility ping or the diagnostic command traceroute. To minimize latency in application performance, developers can use cache engines and buffers.
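The ICMP echo mechanism that ping relies on requires raw-socket privileges in most programming environments, so a minimal Python sketch can time a TCP handshake instead; the handshake's round trip is a reasonable stand-in for network latency. The host example.com and port 443 here are illustrative choices, not part of any standard.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Time a TCP handshake to the host and return the delay in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; the handshake round trip is what we timed
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    for _ in range(3):
        print(f"handshake RTT to example.com: {tcp_rtt_ms('example.com'):.1f} ms")
```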

Techopedia Explains Latency

In data communication, digital networking and packet-switched networks, latency is measured in two ways: one-way trip and round trip. One-way latency is the total time it takes a packet to travel from its source to its destination. Round-trip latency adds the time it takes a response to travel back to the source. Unlike one-way latency, round-trip measurements often exclude the time the destination spends processing the packet, because tools such as ping are designed so the destination replies immediately.
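To make the distinction concrete, here is a minimal Python sketch of the two measurements. The timestamps are illustrative; in practice, one-way measurement is the harder problem because it requires synchronized clocks at both ends.

```python
def one_way_latency_ms(send_ts: float, arrive_ts: float) -> float:
    """One-way trip: timestamp taken at the source, then at the destination.
    Valid only if the two clocks are synchronized."""
    return (arrive_ts - send_ts) * 1000.0

def round_trip_latency_ms(send_ts: float, echo_ts: float) -> float:
    """Round trip: both timestamps come from the same clock at the source,
    so no clock synchronization is needed."""
    return (echo_ts - send_ts) * 1000.0

# Illustrative timestamps in seconds:
print(one_way_latency_ms(0.000, 0.045))     # 45.0 ms, source -> destination
print(round_trip_latency_ms(0.000, 0.092))  # 92.0 ms, there and back
```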

Causes of Latency

In network transmission, the following four elements are involved in latency:

  1. Delay in Storage: Delays can be introduced by reading or writing to different blocks of memory.
  2. Device Processing: Latency can be introduced each time a gateway takes time to examine and change a packet header.
  3. Transmission: There are many kinds of transmission media, and all have limitations. Transmission delay depends on packet size and link capacity: smaller packets take less time to place on the wire than larger packets (see the sketch after this list).
  4. Propagation: It takes time for a packet to physically travel from one node to another, even though signals move at or near the speed of light.
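The last two delays can be expressed as simple formulas: transmission delay is packet size divided by link bandwidth, and propagation delay is distance divided by signal speed. The sketch below assumes illustrative values, including the common approximation that light travels through optical fiber at about two-thirds of its vacuum speed (roughly 200,000 km/s).

```python
def transmission_delay_ms(packet_bytes: int, link_bps: float) -> float:
    """Time to push every bit of the packet onto the link."""
    return (packet_bytes * 8) / link_bps * 1000.0

def propagation_delay_ms(distance_km: float, speed_km_s: float = 200_000.0) -> float:
    """Time for a signal to travel the physical medium."""
    return distance_km / speed_km_s * 1000.0

# A 1500-byte packet on a 100 Mbit/s link, traveling 1000 km of fiber:
print(transmission_delay_ms(1500, 100e6))  # 0.12 ms
print(propagation_delay_ms(1000))          # 5.0 ms
```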

Latency, Bandwidth and Throughput

Latency, bandwidth and throughput are sometimes used as synonyms, but the three terms have different meanings in networking. To understand the differences, imagine network packets traveling through a physical pipeline.

  • Bandwidth describes how many packets can travel through the same pipeline at one time.
  • Latency describes how long it takes a packet to travel the length of the pipeline.
  • Throughput describes the number of packets that can travel successfully through the pipeline in a given time period.
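A rough numeric sketch of the analogy, with made-up values rather than measurements: throughput can approach but never exceed the bandwidth ceiling, while latency is a separate quantity altogether.

```python
# Illustrative numbers for the pipeline analogy (not real measurements):
bandwidth_mbps = 100.0  # capacity: how much data can enter the pipe per second
latency_ms = 50.0       # delay: how long one packet takes to traverse the pipe

# Throughput is what actually arrives per unit time; loss and delays
# keep it at or below the bandwidth ceiling.
data_delivered_megabits = 400.0
elapsed_s = 5.0
throughput_mbps = data_delivered_megabits / elapsed_s
print(throughput_mbps)  # 80.0 Mbit/s, under the 100 Mbit/s bandwidth cap
```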

RAM Latency

Random access memory latency (RAM latency) refers to the delay that occurs in data transmission as data moves between RAM and a device’s processor.

RAM latency can be reduced manually by configuring the memory to complete operations in fewer memory bus clock cycles, for example by tightening timings such as CAS latency (CL) in the system firmware. Speeding up memory isn’t necessary for most users, but it may be helpful for gamers who prefer to overclock their systems.
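As an illustration, enthusiasts often convert a module’s CAS latency from clock cycles into nanoseconds to compare kits. Because DDR memory transfers data twice per I/O clock, a module rated at N MT/s has a cycle time of 2000 / N nanoseconds; the module ratings below are illustrative examples.

```python
def ram_true_latency_ns(cas_cycles: int, data_rate_mts: int) -> float:
    """Convert a CAS latency in clock cycles to nanoseconds.
    DDR transfers twice per clock, so the I/O clock in MHz is
    data_rate_mts / 2 and one cycle lasts 2000 / data_rate_mts ns."""
    return cas_cycles * 2000.0 / data_rate_mts

print(ram_true_latency_ns(16, 3200))  # DDR4-3200 CL16 -> 10.0 ns
print(ram_true_latency_ns(18, 3600))  # DDR4-3600 CL18 -> 10.0 ns
```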



Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret’s idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.