Sometimes I get the feeling our culture and society are really twisting things up. We seem more interested in "political correctness" than in the Truth (with a capital "T"). Who are we kidding? Ourselves, in my humble opinion. I think we've lost our sense of reality, because "political correctness" just isn't sustainable.