Until now.
I have written a couple of screenplays, and it hit me while I was writing my black characters: I wish Hollywood would stop demeaning black characters on film. When I go to the movies, I know as soon as I see a black character that, if he is not the lead (which you don't see often), he is going to act like a buffoon or die instantly. I've seen it in horror pictures, action movies, and comedies.
Sometimes they will let the character live, but he defers to the white character to do all the heroic stuff, like in the Lethal Weapon series. I think most movies put black people in them because of a demographic they're trying to reach, so it's a throw-in: maybe we can get some darkies to support the movie. I feel like most black actors in movies are like that one ensign who beamed down with the Star Trek crew. If he wasn't a regular on the show, he was going to die, and soon. The message is clear: you are inferior and I want you to die. They really play out their hopes and dreams well on screen.
Another message they like to send out is that a black man and a black woman can't be together. I see Denzel Washington and Will Smith with leading ladies who are white, Hispanic, or so damn near pale we can't tell. I don't have anything against light-skinned women, but I find it odd that you rarely see a chocolate or brown leading lady.
Some people probably think this is no big deal, but these images are ingrained in our minds. It's brainwashing, because if you think you're inferior, you will never achieve anything. Well, I want to let Hollywood know I hear them loud and clear. I know I have an uphill battle trying to write strong black characters who love black women, but if Tyler Perry can do it, so can I.