supermax_thelion1
Active member
Random thought I wanted to throw out here: is it just me, or does religion seem to really influence our sense of self-worth, for better or worse? Like, all the teachings about humility and goodness, or about not being "too proud" or self-centered... I wonder if that shapes how we see ourselves and what we think we deserve in life. Does anyone else feel that way? How do your beliefs affect how you value yourself or handle self-esteem? I guess I'm curious whether this is common or just my take on it.