What is Body Image?

Body image is what you perceive yourself to look like and how that perception makes you feel. Although weight is often a major factor, body image isn't only about weight; it can be affected by any aspect of your appearance, from a scar to acne to the way you walk. Most women wish they could change at least one thing about their appearance, and some become preoccupied with that desire.

Body image is usually shaped by the "standard" notion of beauty in your culture, and beauty standards vary widely across the world and throughout history. If you think the pressure is bad now, remember that beauty has always come with restrictions, some more voluntary than others. In ancient China, women's feet were bound to make them inordinately tiny because society considered small feet beautiful; the practice left women crippled and barely able to walk. Women in America laced themselves into constricting corsets to keep their waists looking "delicate" and to meet the beauty guidelines of the day. Today, women starve themselves to look rail thin and get implants to appear larger-breasted. The times may change, but women still face societal pressure to look a certain way.

How you feel about your body is ultimately up to you. Recognizing how media and society shape your perception of yourself is the first step toward accepting yourself.