binary coded decimal
binary coded decimal definition by American Heritage Dictionary
noun Computer Science Abbr. BCD
A code in which a string of four binary digits represents each decimal digit.
The American Heritage® Dictionary of the English Language, 4th edition. Copyright © 2010 by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
binary coded decimal - Science Definition
A code in which a string of four binary digits represents each decimal digit 0 through 9, as a means of preventing calculation errors due to rounding and conversion. For example, since the binary equivalent of 3 is 0011 and the binary equivalent of 6 is 0110, 36 is represented as 0011 0110.
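The digit-by-digit mapping described above can be sketched in Python; `to_bcd` is an illustrative helper name, not something from the source:

```python
def to_bcd(n: int) -> str:
    """Encode a non-negative integer in binary-coded decimal:
    each decimal digit becomes its own four-bit group."""
    return " ".join(format(int(digit), "04b") for digit in str(n))

print(to_bcd(36))  # 0011 0110
```

Note that each decimal digit is converted independently, which is what distinguishes BCD from a straight binary encoding (36 in plain binary is 100100).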