Dear Evangelicals,

As a follower of Jesus, it's become increasingly difficult to identify as a Christian in America. The term, at this point, has to be qualified. Do I follow Jesus? Yes. But today in America, to say I'm a Christian is to say I believe a self-proclaimed pussy-grabbing man is the most biblical President our nation has ever seen. It means I live a life of profound fear of others (definitely immigrants, who are probably all rapists, or so I'm told), I'm probably racist and sexist, and I definitely believe homosexuality is some kind of rare sin for which Christ's blood is insufficient to atone, so anyone who is attracted to members of the same sex is definitely less than human and should be shunned as such.

As a Christian in America, it is assumed I believe saying the word "shit" is actually worse than opening the word of God and telling someone why a particular passage means they are uniquely singled out for hatred by God and doomed to an eternity of fire.

The problem is, I love Jesus and I don't buy any of that shit. Christianity in America has become so intertwined with American culture that no one here is really able to separate the two anymore. Of course Jesus was white, wealthy, lived in the South, hated Muslims, and carried a handgun. Or so it seems to be assumed.

I think it's pretty easy to argue that none of that is true, though.

If you live in profound hatred of gays (or anyone else who lives a life different from yours) and Mexicans, if you think Jesus would have damned prostitutes instead of loving them, if you genuinely believe you've never in your life taken a handout and your taxes should be zero, or that homeless people are just people who don't know how to work hard... well, could you stop calling yourselves Christians? Because it would be really nice if the term could be associated with grace and love again.