chip rate - Computer Definition
In direct sequence spread spectrum (DSSS) technologies such as CDMA and 802.11b, it is the number of chips per second in the spreading signal. A chip is one pulse of the spreading code; it is not a data bit. The spreading signal is combined with the data signal to code each transmission uniquely, and the number of chips per second is significantly greater than the number of data bits per second; the ratio of chip rate to bit rate is the spreading factor. Chip rate is measured in "megachips per second" (Mcps), which is millions of chips per second. See spread spectrum, CDMA and 802.11.
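The relationship between bit rate, spreading factor, and chip rate can be sketched in a few lines. This is an illustrative example (not part of the definition above), assuming the 11-chip Barker code used by 802.11 DSSS, where a 1 Mbps data stream spreads to an 11 Mcps chip stream:

```python
# 11-chip Barker sequence used by 802.11 DSSS (chips as +1/-1 values)
BARKER_11 = [1, -1, 1, 1, -1, 1, 1, 1, -1, -1, -1]

def spread(bits):
    """Spread each data bit: bit 0 sends the code, bit 1 sends its inverse."""
    chips = []
    for b in bits:
        sign = -1 if b else 1
        chips.extend(sign * c for c in BARKER_11)
    return chips

bit_rate = 1_000_000                    # 1 Mbps data rate, as in 802.11 DSSS
spreading_factor = len(BARKER_11)       # 11 chips per data bit
chip_rate = bit_rate * spreading_factor # 11,000,000 chips/s = 11 Mcps

print(chip_rate)          # -> 11000000
print(len(spread([0, 1])))  # 2 bits spread into 22 chips -> 22
```

Each data bit is replaced by the full 11-chip sequence, which is why the chip rate is always a fixed multiple (the spreading factor) of the underlying data bit rate.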