I've long disdained clothing in favor of wearing only what God gave me, and it seems that the more the years go by, the more I prefer to be in my natural state (naked). Clothing feels more and more restrictive, whereas being naked feels more and more free, relaxing, and liberating, allowing the largest organ on my body to breathe. (That organ, of course, is my skin. What did you think I was referring to?) It feels wonderful to go about my everyday activities without the burden of clothing, and I avoid going out when possible simply because it means getting dressed. I occasionally visit a nearby nudist resort where everyone is of the same opinion, and I find it incredibly liberating and refreshing to be among people with imperfect bodies who don't feel the need to hide them. I wish we all could learn to separate nudity from sexuality, and I long for the day when I can mow my grass or take the garbage to the curb without having to put my shorts on. It's not that I have anything to brag about or display with pride. In fact, quite the opposite is true. I just prefer natural nudity over man-made fabric, and I wish that we could all just get over it when it comes to seeing a human body or part thereof. Thank you for indulging my need to express my thoughts.