It is difficult to argue that we do not live in a breast-obsessed society. Every day, we are inundated with images of female breasts; Americans like to see cleavage on the red carpet, in music videos, and at the beach. Yet we often become prudish about breasts when it comes to nursing a baby in public. In recent years, there have been concerted efforts to protect women's legal right to breastfeed publicly (nursing is legal in all 50 states). Even so, and despite definitive studies illustrating the benefits of breastfeeding, there have been several high-profile skirmishes in which nursing mothers have been asked to leave places of business, and even airplanes. How do you explain our country's dichotomous views of female breasts? Why is it that we like to see breasts on TV but perhaps not at a restaurant? Should there be public spaces in which women should not be allowed to nurse? Is nursing in public a mother's right? And what rights, if any, do babies have? Finally, what do you see as appropriate nursing etiquette? Is there such a thing?
Include at least one of the following pieces in your discussion:
- "Woman Kicked Off Plane for Breastfeeding Baby" (MSNBC)
- "Olive Garden Asks Breastfeeding Mom to Either Cover Up or Leave" (NBC News)
- "Facebook Won’t Budge on Breastfeeding Photos" (The New York Times)
- "Peaceful Revolution: Why Breastfeeding Needs to Be Part of Health Care Reform" (The Huffington Post)
Due: Wednesday, April 14