health care also health·care
noun
The prevention, treatment, and management of illness and the preservation of mental and physical well-being through the services offered by the medical and allied health professions.
adjective
Of or relating to health care: the health care industry.