What Hollywood Keeps Getting Wrong About Race

“[Hollywood doesn’t] care about the way racism actually works. They just want to make racism go away…. The movies have an obligation to entertain us but they also have an obligation to be fair to certain aspects of social reality because people take lessons from this stuff.”
