Herbal medicine is the use of plants to prevent and treat illness or to promote good health, as well as the drugs and tinctures made from them.