I keep seeing posts that say, basically, "Return America to its Christian Values!" What does this mean, specifically?
I don’t know what that means. I’m not trying to offend anyone… I’m a Christian… but what, specifically, are we supposed to return to?