square wave - Computer Definition
A periodic wave that alternates between two fixed, discrete values, holding each for an equal length of time, with each value or group of values representing a digital bit. For example, a given signaling protocol might represent a one (1) bit as +3V (volts) and a zero (0) bit as a second, distinct voltage level.
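The bit-to-voltage mapping above can be sketched as follows. This is a minimal illustration, not a real signaling protocol: the +3V level for a 1 bit comes from the definition, while the -3V level for a 0 bit and the `encode_bits` helper are assumptions chosen for the example.

```python
def encode_bits(bits, high=3.0, low=-3.0, samples_per_bit=4):
    """Encode a bit stream as a two-level square wave, holding each
    voltage for an equal length of time (samples_per_bit samples).
    The low level of -3.0 V is an assumption; the definition leaves
    the 0-bit voltage unspecified."""
    wave = []
    for b in bits:
        level = high if b else low
        wave.extend([level] * samples_per_bit)
    return wave

# Each bit occupies the same duration, producing the characteristic
# flat-topped, two-valued square-wave shape.
print(encode_bits([1, 0, 1], samples_per_bit=2))
```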
Used by arrangement with John Wiley & Sons, Inc.
A waveform that rises quickly to a particular amplitude, remains constant for a period of time, and then falls quickly at the end. In digital systems, square waves are the norm because they represent a binary digit (0 or 1). Square waves can also be generated by musical synthesizers, where they produce a raspy sound.
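A synthesizer's square-wave oscillator can be sketched as below. This is a naive illustration under the assumption of an ideal (instant-rise, instant-fall) waveform; the function name and parameters are hypothetical, and real oscillators use band-limited shapes to avoid aliasing.

```python
def square_samples(freq_hz, sample_rate=44100, n=8):
    """Naive square-wave oscillator: output +1.0 for the first half
    of each period and -1.0 for the second half. The abrupt jumps
    give the strong odd harmonics heard as a raspy synth tone."""
    period = sample_rate / freq_hz  # samples per full cycle
    return [1.0 if (i % period) < period / 2 else -1.0 for i in range(n)]

# One full cycle at a quarter of the sample rate: two samples high,
# two samples low.
print(square_samples(11025, sample_rate=44100, n=4))
```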
Computer Desktop Encyclopedia
© 1981-2014 The Computer Language Company Inc. All rights reserved.