In North America, at least, every mainstream and off-mainstream portrayal of sex has become so exploitative and so violent towards women that I truly fear for girls growing up today, who have probably never even heard the term "make love," and whose boyfriends have likely never seen a single example of a loving sexual encounter.
I pray that they are being subversive and getting their true-love-includes-emotions-and-respect fixes under the covers with flashlights.

