Physicians in the United States are doctors who practice medicine on humans and are an important part of the country's health care system. The vast majority of physicians in the US hold a Doctor of Medicine (MD) degree, though some hold a Doctor of Osteopathic Medicine (DO) degree or a Bachelor of Medicine, Bachelor of Surgery (MBBS) degree. The American College of Physicians uses the term "physician" to describe specialists in internal medicine, while the American Medical Association uses it to describe members of all medical specialties.