square wave - Computer Definition
A periodic wave that assumes one of two fixed, discrete values for equal lengths of time, with each value or group of values representing a digital bit. For example, a given signaling protocol might represent a one (1) bit as +3V (volts) and a zero (0) bit as -3V.
A waveform that rises quickly to a particular amplitude, remains constant for a period of time and drops quickly at the end. Square waves are the norm in digital systems because they represent a binary digit (0 or 1). Square waves can also be generated in musical synthesizers, where they have a raspy sound.