Positive body image refers to an overarching appreciation and respect for one’s own body—regardless of whether it meets society’s expectations for how a body “should” look or function. A wealth of ...