After an initial exchange regarding this article and tweet, @epkaufm agreed to share some crosstabs of the original data with me.
The thread below visualizes these data and lays out my criticisms of the original article. https://twitter.com/epkaufm/status/1275692064982917120

It was not clear what was happening to the "undecided" category. By excluding it, you reduce the denominator and thus inflate the apparent level of "agreement."
Levels of "disagreement" were therefore also, by design, not reported. I instead visualize above three categories, recoded from the original seven-point scale: "agree," "neither agree nor disagree," and "disagree."
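To make the denominator point concrete, here is a minimal sketch using the rounded flag/name shares discussed in this thread (the function name is my own illustration, not part of the original analysis):

```python
def agreement_share(agree, neither, disagree, drop_neither=False):
    """Share agreeing, with or without the middle category in the denominator."""
    denom = agree + disagree if drop_neither else agree + neither + disagree
    return agree / denom

# Rounded shares per 10 respondents: 2 agree, 2 neither, 6 disagree.
full = agreement_share(2, 2, 6)                        # 2/10 = 0.20
trimmed = agreement_share(2, 2, 6, drop_neither=True)  # 2/8  = 0.25
print(f"full sample: {full:.0%}, excluding 'neither': {trimmed:.0%}")
```

The larger the neutral middle category, the more its exclusion inflates the reported agreement figure.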
That 6 in 10 liberals do not want to change the flag or the name of the country, while 2 in 10 do, is less headline-grabbing, but it is what the data show.
Excluding the middle "neither agree nor disagree" category would make more sense if these respondents had answered "don't know." But if we are to treat this as an ordinal scale of sorts, then the middle category is meaningful and should be kept.
Kaufmann does acknowledge these exclusions, but the caveats didn't really make it into the headline claims. That said, other items do receive more pronounced support. This is noteworthy, notwithstanding the objections regarding external validity below.
The article was framed as evidence of liberal/"left modernist" wokeness and a coming/already-happening cultural revolution. But the values for the centre and right categories were all excluded.
In subsequent tweets Kaufmann suggested that differences between ideological categories were significant at the 1% level, but it was not clear what the base category was. If "right-wing," then the base category comes from a sample of 56, which is meant to represent conservative opinion in a country of 300 million.
Kaufmann himself notes that the platforms he uses are not very representative of right-wing opinion. So any comparison to a highly unrepresentative base category of 56 individuals is, by his own admission, not meaningful.
I do not doubt that liberals and conservatives have different response patterns; it is just that these data cannot answer that question.
And other survey data (from 2018) show much more muted levels of support among Democrats for, e.g., removing statues https://www.prri.org/research/partisan-polarization-dominates-trump-era-findings-from-the-2018-american-values-survey/
What is more, centrists have very similar responses to liberals for two of the propositions (flag and country name change). This does not fit neatly into a narrative of left wokeness (unless the centre has adopted these views as well).
We also do not know how representative these samples are for the liberal and centre categories. I do not have information on the demographic composition of the sample. But recent research cautions against extrapolating to the population from MTurk convenience samples (even with large N) https://www.cambridge.org/core/journals/political-science-research-and-methods/article/recruiting-large-online-samples-in-the-united-states-and-india-facebook-mechanical-turk-and-qualtrics/C80073966548D0E94161B84504ACE001
Best practice in survey design is to pose questions at an appropriate level of generality, with items that are "brief, relevant, unambiguous, specific, and objective."
The survey items here are made-up proposals and are hard to interpret: what would a new country name be that "better reflects the contributions of Native Americans"?
This also increases the risk that respondents agree (or disagree) with a statement not because it expresses their real view but because it cues a broader or partisan position.
This type of "satisficing" and "expressive responding" is interesting in and of itself, but there is no way of determining its extent here (at least with the crosstab data I have).
In the article, Kaufmann writes "Some 80 percent of those who have made up their mind would replace the national anthem and constitution." The 80 percent figure again excludes those who neither agree nor disagree.
Regarding the constitution item, this is also not what the question asks, and the claim is therefore untrue. The question asks about a "new American constitution," which could be interpreted as amending the existing one or, more radically, replacing it.
This article has received a lot of attention. At the time of writing, the original tweet has been retweeted over 400 times, with 3.2k shares on Facebook.
My main objections to the methods are laid out above. But I think there are also reasons to object to the framing. This is sold as evidence of an "Orwellian," "Maoist" "Cultural Revolution" and the willful destruction of national heritage.
My objection is not to the survey questions being asked—I think it is worthwhile investigating these newly emergent positions and their prevalence. I am myself interested in these questions.
My frustration, which I think I share with many inside and outside the academic community, is that these legitimate research questions are framed in such polarising terms.
Especially given that it is often the same researchers pursuing this research who will, in the same breath, decry the overzealous and polarising rhetoric of their opponents.
I am grateful again to Eric for sharing these data with me. And I hope this thread helps make clear the objections I have to the piece. @benwansell @sundersays @robfordmancs