These 2015 Women's Sports Highlights Will Change How You See Body Image

It's been a powerful year for women in sports – from the U.S. women's national soccer team winning the World Cup to Serena Williams's "Serena Slam" victory at Wimbledon. Female athletes have shown us that sports have a powerful impact on girls' and women's confidence — and studies back it up.