So, random thought here: is the US's bad history with slavery a direct result of European colonialism? I'm not saying it's all Europe's fault, but Europe did export a lot of racist institutions to the US. They brought over the institution of slavery itself, France invented a racist caste system for its colonies based directly on how closely a person was descended from Europeans, and many plantations grew cash crops to ship back to Europe.
I'm not trying to absolve the US of its actions, just trying to understand how it got there in the first place.
Also, the best way to learn the truth is to say something wrong on the Internet, so tear my musings apart, I guess.