Gigavision - Computer Definition
A digital camera sensor technology from Delft University of Technology (TU Delft) in the Netherlands. Developed by Edoardo Charbon and team, Gigavision uses memory-chip technology for the pixels. Memory cells are sensitive to light and can be packed at least an order of magnitude more densely than the pixels in the traditional CCD and CMOS sensors used for image capture. Gigavision focuses the incoming light onto an array of cells, each of which yields a binary value (light or no light); many such one-bit readings are combined to produce a conventional multi-bit pixel. Because the pixel outputs are already in digital form, the analog-to-digital conversion circuitry required in CCD and CMOS cameras is unnecessary.
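The idea of recovering a conventional intensity value from many one-bit cells can be sketched in a few lines. This is an illustrative simulation, not the actual Gigavision reconstruction pipeline: the function names, the cell count, and the simple maximum-likelihood inversion are all assumptions made for the example. Each cell outputs 1 if it detects at least one photon during the exposure (photon arrivals are modeled as Poisson), and the fraction of cells that fired is inverted to estimate the light level.

```python
import math
import random

def sense_block(mean_photons_per_cell, n_cells, rng):
    """Simulate one pixel built from n_cells one-bit memory cells.

    Under Poisson photon arrivals, a cell fires (outputs 1) with
    probability 1 - exp(-lambda), where lambda is the expected
    photon count per cell. Returns how many cells fired.
    """
    p_fire = 1.0 - math.exp(-mean_photons_per_cell)
    return sum(1 for _ in range(n_cells) if rng.random() < p_fire)

def estimate_intensity(ones, n_cells):
    """Invert the binary response: recover the expected photons per
    cell from the fraction of cells that fired (hypothetical
    maximum-likelihood estimate for the Poisson model above).
    """
    frac = min(ones / n_cells, 1.0 - 1e-9)  # guard against log(0)
    return -math.log(1.0 - frac)

# Example: a scene patch averaging 0.8 photons per cell,
# read out through 4096 one-bit cells.
rng = random.Random(0)
fired = sense_block(0.8, 4096, rng)
estimate = estimate_intensity(fired, 4096)
```

With thousands of cells per pixel, the estimate concentrates tightly around the true light level, which is why the dense one-bit cells can stand in for a single analog pixel.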