How to Be Body Positive and Make Healthier Choices
Body positivity is a social movement that encourages people to embrace their bodies and feel happy in their own skin regardless of appearance. It also began as an effort to help society become more understanding and accepting of larger frames. Body positivity offers a helpful perspective because no one should hate themselves or be...
By Ana Snyder, M.S., Exercise Physiology; CPT, FNS