I live in Georgia, a state dominated by Christians... and I frequently call them "Christians".
I live in a very nice, nearly all-white neighborhood, not because it's all-white (something I never even noticed when I moved here) but because I like the home.
On Sunday mornings (I get up early), only the same five cars, out of around 700 houses, drive out of the subdivision, I guess to church.
What I have been trying to ascertain in the 10 years I have lived here is just who the "Christians" are.
They all seem to be Republicans. None of my neighbors are Democrats; in fact, they loathe Democrats passionately.
The local businesses, radio stations, and newspapers, with the possible exception of The Atlanta Journal-Constitution, are flaming right-wing radical evangelical "Christians" with a psychopathic hatred toward Obama.
Hobby Lobby here, as well as Chick-fil-A, is openly radical right-wing "Christian" and Republican.
The Republican politicians declare their "love of Jesus" before they start spewing hatred at gays, Obamacare, and Obama himself, and dismissing the general population of non-white citizens as lazy and dependent on free handouts.
So, is it true that the Republican Party and Christianity are the same thing?
Are liberal Democrats of the same mind as the Republican Christians... that Christianity should be the dominant political force here, throughout the South, and across all of America?
Maybe those of you who are progressive liberal Democrats can explain whether Christianity and politics really are the same thing here.
I see it this way: conservatives see America as a Christian country and see nothing wrong with total Christian domination.
Iran sees itself as an Islamic country and sees nothing wrong with Muslim domination.
That looks like the same thing to me.
Do Democrats see Christianity as their political party also?