Dominionism Definition

də-mĭnyə-nĭzəm
noun
The theory or doctrine that Christians have a divine mandate to assume positions of power and influence over all aspects of society and government.
American Heritage
The belief that God gave humans the right to exercise control over the natural world.
American Heritage
