Black Women's Health Imperative
The Black Women's Health Imperative (BWHI) is a pioneering organization committed to addressing the distinct health challenges faced by Black women in the United States. Founded in 1983, BWHI has been at the forefront of promoting the health and well-being of Black women and girls through advocacy, research, education, and policy initiatives. …