Within health care, there has long been a gender division of professional labor: men have predominated in higher-status, higher-paying professions such as medicine and dentistry, while women's health care work has been clustered in so-called support occupations such as nursing. Historically, health care professions were gendered, and beliefs about gender became embedded in professional work. Recently, however, traditional gender divisions of labor have been challenged by the feminization of professions in the United States and Canada, as women's participation expands in traditionally male-dominated fields. This article explores the nature and causes of this feminization and considers whether it is changing the significance of gender in health care employment.