computer generations - Computer Definition
Following is a brief summary of the generations of computers based on their hardware and software architecture.

First Generation
In the late 1940s and early 1950s, computers (EDSAC, UNIVAC I, etc.) used vacuum tubes for their digital logic and mercury delay lines for memory. See early memories, EDSAC and UNIVAC I.

Second Generation
In the late 1950s, transistors replaced vacuum tubes, and magnetic cores were used for memory (IBM 1401, Honeywell 800). Size was reduced, and reliability was significantly improved. See IBM 1401 and Honeywell.

Third Generation
In the mid-1960s, computers used the first integrated circuits (IBM 360, CDC 6400) as well as the first operating systems and database management systems. Although most processing was still batch oriented, using punch cards and magnetic tapes, online systems were being developed. This was the era of mainframes and minicomputers: essentially, large centralized computers and small departmental computers. See punch card, System/360 and Control Data.

Fourth Generation
The mid-to-late 1970s spawned the microprocessor and the personal computer, introducing distributed processing and office automation. Word processing, query languages, report writers and spreadsheets put large numbers of people in touch with the computer for the first time. See query language and report writer.

Fifth Generation - The Future
As of the 21st century, we are entering the fifth generation, which increasingly delivers various forms of artificial intelligence (AI). Faster hardware and much more sophisticated search and natural language recognition are major features. See AI, virtual assistant and natural language recognition.