Originally posted by Klaw
Rights come from Nature, and the Government's job is to recognize those Rights and protect them from infringement.
The Constitution specifically recognizes those rights; more specifically, the Bill of Rights does. It is the government's job to uphold the Constitution, which recognizes those rights.
Oh, and they come from God Himself... if you don't believe in God, then yeah, you could say they come from nature. But our founders were deeply religious and wise, and they rightly recognized that rights come from God.