Book Review: Women in American History: A Social, Political, and Cultural Encyclopedia and Document Collection
Abstract
To this day, high school and college students rarely learn about the role of women in American history, culture, or politics. Teachers and textbooks still focus predominantly on the white Christian heterosexual men who continue to receive most of the credit for building the United States of America. While it is a fact that, for most of American history, only white men could own land, vote, and serve in government, women of all races, religions, and sexual orientations have done a great deal to advance American culture, fight for justice, and shape the laws, businesses, scientific research, and education systems that have developed in the United States over time.
DOI: https://doi.org/10.5860/rusq.57.1.6465