By Sergio | Shadow and Act May 27, 2011 at 12:29PM
Well, should they?
How appropriate to follow the Clark Johnson item below with this question, revisiting an issue that I've thought about from time to time. Should black directors be obligated to direct black films? I've always been of the belief that black directors should have the right to direct any kind of film they want to, from A to Z, from family films to porn, regardless of the makeup of the cast. Other directors, regardless of race, creed or color, make all sorts of films, so why should black directors be limited to black films?
However, what if a black director makes a conscious decision never to make a black film, for whatever reason: bigger budgets, bigger paychecks, a desire to be part of the Hollywood "in crowd", not wanting to be stuck in the "black" box — you name it. How would you feel about that? Would you consider that a betrayal? A failure to live up to their responsibilities as a black director? One could argue that, yes, they should, simply because we need all the black films we can get, to explore all facets of black life and culture. It can't be all Tyler Perry all the time.
Then again, others may argue, why should they? A film director should be free to make whatever film they want. Why should they carry the burden of "representing the race"? No one said Ang Lee should stick to making Asian films like Crouching Tiger, Hidden Dragon, when he also made Sense and Sensibility, Ride with the Devil, The Hulk and Brokeback Mountain.
OK, folks, let's hear your views on this matter...