AstroNerd23
Active member
Women’s sports are gaining more attention than ever before. From the U.S. Women's National Soccer Team's fight for equal pay to increasing coverage of women’s leagues, the gap between men’s and women’s sports is slowly closing.

