While it would be lovely to think that reality is far more forward-thinking than fiction, the storied history of women in film and TV proves otherwise. So, you can file that under con. Pro? It means Hollywood has helped pave the way for (often long overdue) change. The screen, both big and small, has served as a conduit for introducing charged ideas about race, gender, politics, sex, power and so much more.
Granted, women in real life face real-life obstacles. But seeing women live their best, most badass lives on-screen makes you want to push past those obstacles — and we can thank trailblazing women of TV and film for that visual reminder. Despite being marginalized, having their work diminished and being routinely pushed aside throughout history, women on screen have pushed back to become pioneers of social progress. Representation matters. Seeing different versions of ourselves on screen is important. Has Hollywood dismantled the patriarchy? No. But the women who’ve been stealing the spotlight since the inception of moving pictures are sure as hell doing their best to bring the female perspective to the forefront.
So, in celebration of these leading ladies, let’s take a look at some of their most awe-inspiring moments — from groundbreaking conversations to historic award wins.
A version of this article was originally published August 2020.