I don't understand why the Christian way of life is supposed to be the only one with good values. I can't tell you how many times I have heard people say that it's important for everyone to remember what "values" our country was founded on and to follow those religious views today.
Since when did you have to be Christian to be a good person? Granted, I was raised Christian, but I certainly learned everything I know from my parents, not my church. It was more along the lines of: my mom taught me right from wrong and how to behave, and then I practiced it when I went to church. She taught me please and thank you, and other mothers from church would then compliment me on my good manners.
I also learned quite a bit from TV and the media. Some children's shows are actually very educational. I learned what's nice and what's mean, and that how you feel about someone isn't necessarily something you should tell them.
Of course these values are valid in the Christian religion, but that's not the only place they're taught. Many negative ideas are also expressed within Christianity, such as the oppression of homosexual love.
The way I feel about the love of my life is not wrong; it's more right and wonderful than anything else I know. And if we ever have children together, I am going to raise them as far from religion as possible, but I have no doubt in my mind that they will be every bit as kind and loving as me or my lover.