Submitted by smesch83 t3_xzh339 in television
I'm reading a novel at the moment where it's clear that women just are not really important to the creator or the plot: there are not many female characters, they don't play interesting parts, and the author just seems WAY more interested in all of his (many) male characters.
I was wondering if there's a current TV drama or comedy that gives you these vibes. It doesn't have to be anti-feminist or reactionary or OPPOSED to women. Also, it doesn't have to be something like "Das Boot" (maybe? I haven't watched it), where the setting itself is male-dominated.
But personally, I can't think of a current TV show that doesn't make a point of saying "Hey: we love our female characters, we give them space and scenes to shine, and this is something we're passionate about." Is this true for ALL shows?
VeryBadDr_ t1_irm8agd wrote
WWE Smackdown.