For many years, African-Americans were portrayed almost exclusively in a negative light in television and film. The days of playing only slaves, prostitutes, drug addicts, drug dealers, and pimps are over. With more African-Americans owning media companies and creating positive content, they now shape what appears on screen both in front of and behind the camera.