Wild West Definition

noun
The western United States during the period of its settlement, especially with reference to its lawlessness.
American Heritage
noun

The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly.

Wiktionary

(by extension) A place or situation in which disorderly behavior prevails, especially due to a lack of regulatory oversight or an inadequate legal system.

The CEO commented that the Russian business environment of the 1990s was the Wild West.
Wiktionary
