Since its inception, the film industry has been dominated by men. Although feminist culture has made its way into mainstream society, the battle for gender equity in Hollywood continues. Hollywood is...