Terrain Theory Definition

noun

An obsolete theory of disease, proposing that a weak body attracts germs.

Wiktionary