Halle Berry said the obvious, but I still give her dap and props. She got her Oscar back in 2002 as the first Black actress to win for a leading role, and since then not another African American actress has won in that category. Yeah, Hollywood is racist, no shock or surprise at all. It's still a good ole white boys' club, always has been and always will be until folks demand better treatment and quality overall, not a token role or token award.