Peace Corps - Cultural Definition
An agency of the United States government that sends American volunteers to developing nations to help improve living standards and provide training. Created by President John F. Kennedy in 1961 under the auspices of the Department of State, the Peace Corps provides an opportunity to share American wealth, technology, and expertise. During the Cold War, it also served as a means of spreading American influence and values, in the hope of preventing developing nations from allying themselves with the Soviet Union.