propagation delay - Computer Definition
The time required for a signal to travel from one point to another, generally from a transmitter through a medium to a receiver. Propagation delay depends on the nature of the electromagnetic signal, as not all signals travel at the same speed through a medium. Propagation delay is also influenced by the distance between the two points, the density of the medium, and the presence of passive devices such as loading coils that might increase the impedance of the medium. See also impedance, loading coil, medium, and velocity of propagation (Vp).
Used by arrangement with John Wiley & Sons, Inc.
The time it takes a signal to travel from one place to another. Propagation delay depends solely on the distance and the speed of the signal; signals traveling through wire or fiber generally propagate at about two-thirds the speed of light. Contrast with nodal processing delay.
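The relationship in the definition above (delay equals distance divided by signal speed, with signals in wire or fiber moving at roughly two-thirds the speed of light) can be sketched as follows. The function name and the 1,000 km example distance are illustrative choices, not part of the original definitions.

```python
# Minimal sketch: propagation delay = distance / propagation speed.
# Assumes a velocity factor of two-thirds the speed of light, as the
# definition states for typical wire and fiber media.

C = 299_792_458.0          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 2 / 3    # fraction of c for typical wire/fiber

def propagation_delay(distance_m: float) -> float:
    """Return the one-way propagation delay in seconds."""
    return distance_m / (VELOCITY_FACTOR * C)

# Example: a hypothetical 1,000 km fiber link incurs about 5 ms
# of one-way propagation delay.
print(f"{propagation_delay(1_000_000) * 1e3:.2f} ms")
```

Note that this captures only the propagation component; per the "contrast with" note above, time spent being processed at intermediate nodes (nodal processing delay) is a separate quantity.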
Computer Desktop Encyclopedia
© 1981-2014 The Computer Language Company Inc. All rights reserved.