Ever since the beginning, God has given us chances to redeem ourselves and come back to Him. From being sent out of the garden (which was actually an act of compassion) to the Judges to Jesus, He has always given us the chance to choose between Him and Satan. Without that choice, faith is worthless. If we were forced to side with God, we wouldn't actually be human; we would be just emotion and intelligence without will (which is a vital part of what makes us us).
I say that because I believe that just as God gave us a choice between good and evil, we as Christians shouldn't impose laws on others that force them to abide by what we assume God wants. By creating laws against sins, we are removing the very choice that God set up.
Libertarians believe (as I've been informed) that as long as someone's actions don't affect you, you shouldn't impose your will on them. This means things like drugs would be legal (provided people under the influence don't harm others), but it also means abortion stays legal. Here comes my possibly controversial view.
I think that abortion and other such things should remain (or become) legal. We shouldn't impose our will on others. God told us not to judge each other; He is the Judge over all of us. By making it a law that you cannot have an abortion, or anything else, you are taking away the free will God gave us.
We shouldn't punish others for their wrongdoings; God does. Turn the other cheek, love them, and leave it to God to set them straight. You can preach to them, but if they don't accept it, don't force them to abide by what you find morally correct. Forcing our morality on others is what has given Christians the bad name they have now. Whenever someone finds out I'm a Christian, they assume I want to outlaw everything sinful and that I frown upon anyone who does anything wrong. We have made that name for ourselves thanks to a few overzealous Christians with power.
Your thoughts?