When it comes to discussions of American history, many on the left like to point out that the United States has a long, sordid history of brutal oppression, slavery, and genocide. Many on the right, meanwhile, claim that no other nation on Earth has done more than the United States to protect human rights and human dignity. The tragic truth is that both sides are correct.