“[Hollywood doesn’t] care about the way racism actually works. They just want to make racism go away…. The movies have an obligation to entertain us but they also have an obligation to be fair to certain aspects of social reality because people take lessons from this stuff.”
