People often assume that America is unaware of its racism, or that people simply don't understand how systemic injustice works. This is false. Everyone knows that America is racist and that white people benefit from systemic injustice; they just don't care, or they are actively ignoring it.
I’m not saying we should stop educating people, but lack of knowledge is not as big a problem as the lack of love, respect, or empathy that white Americans have for everyone else.
And I know that, as a white woman, I'm not the most qualified to speak on these issues. But I also know that as long as the locus of power rests with those who benefit from the oppression of others and who care about no one but themselves, nothing is going to get any better.