display adapter - Computer Definition
A plug-in card in a desktop computer that performs graphics processing. Also commonly called a "graphics card" or "video card," modern display adapters use the PCI Express interface, while earlier cards used PCI and AGP. The display adapter determines the maximum resolution, refresh rate and number of colors that can be displayed, all of which the monitor must also support. On many PC motherboards, the graphics circuits are built into the chipset, and a separate plug-in card is not required.

1 - The Graphics Pipeline
The modern display adapter performs two operations. The first is graphics rendering, which moves graphics data through a pipeline that creates the image frames and adds texture and shading, functions previously performed by the CPU in the first PCs. A high-end display adapter is a sophisticated parallel processing computer. See GPU and graphics pipeline.

2 - Analog and Digital Outputs
The second and more elementary operation is to continuously convert the graphic patterns (bitmaps) rendered in the memory frame buffers into signals for the monitor's screen. The first PC display adapters (CGA, EGA, PGA) output digital signals, and the monitor converted them to analog for the CRT. Starting with VGA in 1987, adapters sent analog signals to the monitor; modern adapters, however, output digital DVI or DisplayPort signals. Flat panel monitors use DVI but also include a VGA socket to accommodate older machines. On laptops, the display circuitry has been digital from end to end. See shared video memory and how to select a PC monitor.
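As an illustration of how resolution, color depth and refresh rate combine, the sketch below (hypothetical helper names, not from any real API) computes the size of one rendered frame in the frame buffer and the raw pixel data the adapter must convert to monitor signals each second:

```python
# Illustrative sketch: frame buffer size and scan-out rate.
# Function names are hypothetical, chosen for this example only.

def frame_buffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to hold one rendered frame in video memory."""
    return width * height * bits_per_pixel // 8

def scanout_bytes_per_second(width, height, bits_per_pixel, refresh_hz):
    """Raw bitmap data converted to monitor signals every second."""
    return frame_buffer_bytes(width, height, bits_per_pixel) * refresh_hz

# A 1920x1080 display at 32 bits per pixel (16.7 million colors plus alpha):
print(frame_buffer_bytes(1920, 1080, 32))            # 8294400 (about 7.9 MB)
print(scanout_bytes_per_second(1920, 1080, 32, 60))  # 497664000 (about 498 MB/s)
```

Higher resolutions, deeper color or faster refresh rates multiply these figures, which is why both the adapter and the monitor must support a given display mode.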