RUNDOWN VIDEO: The USWNT Won The World Cup, But They Lost America

The U.S. Women’s National Soccer Team won the World Cup over the weekend, but after a tournament marred by social justice controversy, the team lost America. 

