Are University Diversity Staffers Rabid Anti-Semites? A New Report Would Like You to Think So.
It looks like the authors started with their conclusion and tried to make the data fit.
Serious social scientists observe some basic principles. They do not report selective or exaggerated results. They make their data available so others can evaluate its accuracy and significance. They are responsive to questions. Politicized studies, which are unreliable by nature, do not adhere to these simple conventions. A case in point is a recent report from the Heritage Foundation purporting to show rampant anti-Semitism in university diversity, equity, and inclusion (DEI) programs. Although there has been an alarming increase in anti-Semitic incidents on many campuses, the Heritage report does not support its conclusion about widespread anti-Jewish sentiments in DEI programs—a problem exacerbated by the authors’ refusal to respond forthrightly to inquiries and criticism.
According to the Heritage Foundation website, the study, titled “Inclusion Delusion: The Antisemitism of Diversity, Equity, and Inclusion Staff at Universities,” reports that “an overwhelming number of DEI hires are spewing anti-Semitic views about Israel on social media.” The authors have made this assertion in newspaper opinion essays, and it has been repeated by media outlets, but unpacking the numbers shows that the study’s methodology is deeply flawed and the widely publicized conclusion isn’t supported by the evidence.
The researchers began by locating publicly accessible Twitter accounts for 741 DEI officials at 65 leading universities. Among these accounts, they identified 633 tweets about Israel, of which 605 were negative or hostile. In contrast, there were only 216 tweets about China during the same unspecified period, of which 128 were coded as favorable. From these thin data the authors, Jay Greene and James Paul, concluded, without qualification, that “DEI staff are relatively obsessed with Israel,” and claimed to have shown that DEI staff apply a “double standard” to Israel and China.
In fact, the reported figures tell us nothing about “DEI staff” in general, or even the relative prevalence of anti-Israel sentiment in the ranks, because they do not include the number of individuals responsible for the negative tweets. The report confuses the volume of speech with the number of speakers. There may have been only a handful of rabid DEI officials tweeting hundreds of times each, while the overwhelming majority said nothing about Israel or China at all. If even twenty or thirty staffers hate-tweeted dozens of times apiece, we would still have virtually no valid information about DEI personnel as a group (including over 2000 non-tweeting staffers identified by Greene and Paul but excluded from the study).
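To see how little the headline numbers constrain the real picture, consider a quick back-of-the-envelope calculation. The sketch below, in Python, uses only the two figures the report discloses; the unique-tweeter counts are hypothetical, since the report withholds them:

```python
# The report discloses two figures: 741 located accounts and 605 hostile tweets.
# It does not disclose how many distinct accounts produced those tweets.
SAMPLE_SIZE = 741      # DEI staffers with publicly accessible Twitter accounts
HOSTILE_TWEETS = 605   # tweets about Israel coded as negative or hostile

# Two extreme scenarios, both consistent with the published counts:
for unique_tweeters in (10, 605):  # hypothetical values, not from the report
    share = unique_tweeters / SAMPLE_SIZE
    average = HOSTILE_TWEETS / unique_tweeters
    print(f"{unique_tweeters:>3} unique tweeters = {share:.1%} of the sample, "
          f"~{average:.0f} hostile tweet(s) apiece")
```

Ten prolific accounts and six hundred occasional ones would generate identical headline numbers, yet they describe utterly different populations.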
I have made multiple attempts to obtain the crucial underlying numbers from the authors, by emailing and submitting webinar questions, but have received no replies. Greene, the lead author, has acknowledged my criticism, which was first published in The Forward, but his responses have been evasive at best. At a webinar sponsored by the Jewish Institute for Liberal Values, Greene said that the question reflected a “failure to grasp sampling techniques,” and the criticism was “akin to saying a public opinion poll is too small.” Greene repeated the survey-sample argument at a webinar for Scholars for Peace in the Middle East. The “important thing is not the count, but the proportion,” he insisted, and there is “no reason to think [the 741] are non-representative” of the entire group.
The problem with the study, however, is not the size of the sample, but rather the response rate. A legitimate poll provides information about not only the total number of persons contacted (by analogy, the 741 Twitter accounts), but also the number who responded to the specific question (those who tweeted about Israel, one way or another). Even assuming that 741 is a meaningful initial sample, the study cannot tell us anything about the actual incidence of anti-Israel sentiment, or even the “proportion” of anti-Israel personnel, unless we know the number of unique accounts responsible for the objectionable tweets. That information is definitely available to the authors, who collected the names and institutions of the tweeters. The authors reported the denominator; it would be a simple matter of counting to supply the numerator.
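If the underlying records look anything like what the authors describe, supplying that numerator is a one-line computation. A minimal sketch, assuming a hypothetical per-tweet record format (the field names are mine, not the report’s):

```python
# Hypothetical records: one entry per coded tweet, as the authors' method implies.
tweets = [
    {"account": "@staffer_a", "topic": "Israel", "coding": "negative"},
    {"account": "@staffer_a", "topic": "Israel", "coding": "negative"},
    {"account": "@staffer_b", "topic": "China", "coding": "favorable"},
    # ... the report implies 633 Israel tweets and 216 China tweets in total
]

# The withheld numerator: distinct accounts behind the hostile Israel tweets.
hostile_accounts = {
    t["account"]
    for t in tweets
    if t["topic"] == "Israel" and t["coding"] == "negative"
}
print(f"{len(hostile_accounts)} unique accounts out of 741")
```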
But the study’s methods are actually worse than that. The so-called sample is obviously subject to self-selection bias, as it was neither randomized nor controlled for confounding variables. Contrary to Greene’s confident assertion, there is no reason to believe that tweeters are in fact representative of the much larger body of DEI staffers. If anything, common sense and survey research suggest that tweeters are far more likely to hold strong opinions about political and social issues, which is precisely the sort of self-selection that makes a sample unrepresentative.
The comparison of attitudes toward China and Israel is likewise invalid. According to Greene and Paul, the difference in tweeting practices—condemnation of Israel versus approval of China—is proof of an anti-Semitic double standard. But that charge is plausible only if both types of tweets come from the same people. Given the numbers in the report, however, it is entirely possible that some tweeters intensely dislike Israel, while a completely different cluster likes China. That would be evidence only of divergent interests within a diverse population, not of a discriminatory double standard held by any individual or group. I asked the authors whether there was any overlap between the anti-Israel and pro-China tweeters, but they did not reply.
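Checking for that overlap is equally trivial with the data the authors already hold. A sketch with invented handles standing in for the withheld names:

```python
# Hypothetical sets of handles; the real ones sit in the authors' dataset.
anti_israel_tweeters = {"@staffer_a", "@staffer_c", "@staffer_d"}
pro_china_tweeters = {"@staffer_b", "@staffer_e"}

# The double-standard charge requires the same accounts in both groups.
overlap = anti_israel_tweeters & pro_china_tweeters
if overlap:
    print(f"{len(overlap)} accounts tweeted both ways; a double-standard claim "
          f"could at least be tested against them")
else:
    print("No overlap: two different groups, not one group with a double standard")
```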
No study would survive peer review with such glaring gaps in the data. Reporting only selected, favorable results is known as “cherry-picking”; attributing group inferences to individuals is called the “ecological fallacy,” a species of the part-whole fallacy. Both are critical methodological failings. Even if the authors believe that the number of unique accounts is irrelevant, that does not explain their refusal to respond to repeated requests for basic information. Unless, of course, publication of the complete data would undermine their preferred conclusions, which at this point is the logical implication.
Greene’s most extravagant claim, presented in a webinar hosted by Scholars for Peace in the Middle East, was that “DEI hiring rewards anti-Semitism,” which is not remotely backed up by the data he disclosed, and likely contradicted by the data the coauthors have withheld. I will readily revise my critique if it turns out that a significant fraction of the 741 DEI tweeters engaged in Israel bashing, but it appears for now that the report was driven more by politics than legitimate social science, as part of the Heritage Foundation’s campaign against DEI programs and intensified emphasis on “hot-button” education debates. Greene has called for his evidence to be used “politically,” with state legislatures holding hearings over the “pattern” he discovered and donors pressuring universities to reduce or eliminate DEI programs.
Anti-Semitism is a real and serious problem on some campuses, as recently described in a bipartisan letter from 39 members of Congress to the Department of Education’s Office for Civil Rights, and I do not doubt its venomous presence among some cohort of DEI personnel. I have written often about the need to confront progressive and left-wing anti-Semitism, but that cannot succeed when overheated accusations are founded on manipulated data.