We were taught from an early age that Lincoln was a tyrant. The slaves were treated just fine and had nothing to complain about. The Civil War was an unjustified act of aggression. Somehow... the Jews were behind it? (Yeah, I don't know, either.) And we would all be rich if the evil government hadn't taken away our sacred ancestral land. Most of us still embrace those beliefs despite all evidence to the contrary.
The weirdest part is that most of my relatives are descended from people who emigrated to America in the early 1900s. Those who can trace their roots in America back further are mostly the products of Louisiana "coonasses" (descendants of French Canadian or "Cajun" fur traders) or indigenous peoples like the Creek or Chickasaw. I can't understand why they're still so pissed off that a bunch of mostly Scots-Irish landowners had their slave labor liberated.