Health Care Definition
health care
noun
The prevention, treatment, and management of illness and the preservation of mental and physical well-being through the services offered by the medical and allied health professions.
American Heritage Medicine
adjective
Of or relating to health care.
The health care industry.
American Heritage Medicine