Hollywood’s Woke Era Is Over. Now It’s Turning the Culture War Into Camp.


The industry seemed penned in by our political debates — until it started channeling them into wild caricatures and frothy drama.
