Unit interval (data transmission)

In data transmission, the unit interval (UI) is the minimum time interval between condition changes of a signal, also known as the pulse time or symbol duration; it is the time taken in a data stream by each successive pulse (or symbol).

When the UI is used as a unit of measurement for a time interval, the resulting measure is dimensionless: it expresses the interval as a multiple of the UI. Very often, but not always, the UI coincides with the bit time, i.e. with the time taken to transmit one bit (binary digit).

The two do coincide in NRZ transmission; they do not coincide in 2B1Q transmission, where each pulse (symbol) carries two bits and therefore lasts two bit times. For example, on a serial line with a symbol rate of 2.5 GBd, the unit interval is 1/(2.5 GBd) = 0.4 ns.
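The same arithmetic can be written as a short sketch. The following Python snippet is purely illustrative (not taken from the cited sources); it computes the unit interval from a symbol rate and the bit time for a given number of bits per symbol, using NRZ (1 bit per symbol) and 2B1Q (2 bits per symbol) at the 2.5 GBd rate of the example above.

    # Illustrative sketch: unit interval vs. bit time for two line codes.
    # Assumes the symbol rate is given in baud (symbols per second).

    def unit_interval(symbol_rate_baud: float) -> float:
        """Return the unit interval (symbol duration) in seconds."""
        return 1.0 / symbol_rate_baud

    def bit_time(symbol_rate_baud: float, bits_per_symbol: int) -> float:
        """Return the time needed to transmit one bit, in seconds."""
        return unit_interval(symbol_rate_baud) / bits_per_symbol

    # NRZ: one bit per symbol, so the bit time equals the UI.
    print(unit_interval(2.5e9), bit_time(2.5e9, 1))   # 4e-10 s and 4e-10 s (0.4 ns)

    # 2B1Q: two bits per symbol, so each UI spans two bit times.
    print(unit_interval(2.5e9), bit_time(2.5e9, 2))   # 4e-10 s vs 2e-10 s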

Jitter measurement

Jitter is often measured as a fraction of the UI. For example, a jitter of 0.01 UI displaces a signal edge by 1% of the UI duration.

The widespread use of the UI in jitter measurements stems from the need to apply the same requirements or results to systems with different symbol rates. This is possible when the phenomena under investigation are not independent of the symbol duration but closely tied to it. For example, the UI is used to measure timing jitter in serial communications and in on-chip clock distribution.
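A minimal sketch of why this normalization is convenient, assuming the jitter is already known in seconds: the same absolute jitter corresponds to different fractions of a UI at different symbol rates, so a requirement expressed in UI carries over directly between them. The function name and example figures below are only hypothetical.

    # Illustrative sketch: expressing absolute jitter as a fraction of the UI.

    def jitter_in_ui(jitter_seconds: float, symbol_rate_baud: float) -> float:
        """Convert jitter measured in seconds into unit intervals."""
        ui = 1.0 / symbol_rate_baud
        return jitter_seconds / ui

    # The same 4 ps of jitter at two different symbol rates:
    print(jitter_in_ui(4e-12, 2.5e9))    # 0.01 UI at 2.5 GBd (UI = 400 ps)
    print(jitter_in_ui(4e-12, 10.0e9))   # 0.04 UI at 10 GBd  (UI = 100 ps)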

This unit of measurement is used extensively in the jitter literature; examples can be found in various ITU-T Recommendations[1] and in the tutorial by Ransom Stephens.[2]

References

  1. ^ ITU-T Recommendation G.825 (03/2000), Series G: Transmission systems and media, digital systems and networks – Digital networks – Quality and availability targets.
  2. ^ Ransom Stephens, Tektronix Jitter 360° Knowledge Series, from http://www.tek.com/learning/