Body positivity is the assertion that all people deserve a positive body image, regardless of how society and popular culture define the ideal shape, size, and appearance. The images that follow were selected to help people build confidence in and acceptance of their own bodies.