One of the first things I noticed when I walked into my supervisor’s office at Sonke was a poster on her wall. It was an understated photograph in muted tones, printed on a plain piece of paper, with a simple message: “Stop telling women to smile.” This struck me. I had never before considered a smile to be a tool of degradation or oppression. Are professional photographers oppressing their clients when they tell them to smile for the camera? Of course not.

Why, then, is it problematic when a man tells a woman to smile?

I couldn’t have answered this question if you had asked me a month ago. But in the time since then, I’ve encountered this phenomenon firsthand. I was walking along Kloof Street—with a purpose, as I have grown accustomed to—when I noticed a group of three men on the sidewalk, maybe ten feet away, looking over at me. Their body language and facial expressions told me that they were confused, and maybe a little curious. As I got closer, I expected them to ask me for directions or about the city, which boosted my ego because it meant that my confident stride was convincing enough to make me look like a local. I prepared myself to politely disengage. But what one man said caught me off guard.

“Excuse me, where is your smile?”

I smiled.

And then I remembered the poster.

I was angry at myself for complying so easily. I’m a self-proclaimed feminist. I’m working at a gender justice organization. How could I play into something like that so readily? Now I know why men telling women to smile is problematic. When I smiled in that instant, it was insincere. I felt used, and even traitorous to women everywhere. A woman’s smile should convey happiness and joy, not cater to the male gaze. A woman should never have to wear a skirt or a dress or a smile because a man asks her to.