Health Care Definition
healthcare

noun
  
The prevention, treatment, and management of illness and the preservation of mental and physical well-being through the services offered by the medical and allied health professions.
American Heritage Medicine
  
adjective
  
Of or relating to health care.
The health care industry.
American Heritage Medicine