Women's Studies


Women's studies majors are often interested in examining society through a feminist lens. When you major in women's studies, you have the chance to learn more about the history of gender in society and potentially make a difference in the world.

You'll take classes that focus on women's health, famous women in history, the history of feminism, and women in literature and pop culture. To do well in these classes, you'll need the ability to think critically, question your own beliefs, and listen to the opinions of others.

Because a women's studies degree is so broad, you'll have experience in several areas when it's time to choose a career path. Women's studies majors have gone on to work in journalism, public health, social work, politics, education, and the nonprofit sector.