I’ve written previously about recent polling on Common Core, noting that PDK/Gallup’s recent poll result on that topic is way out of whack with what other polls have found. One common argument you hear to explain this result is that PDK has a different wording than other polls. I always found this argument a little suspect, because I doubted that such hugely disparate results could be explained by the PDK wording (which, to me, seems relatively neutral).
In the 2015 PACE/USC Rossier poll, we designed questions to test the impact of Common Core poll wording on respondents’ views toward the standards. Specifically, we split our sample of 2400 four ways, randomly assigning each group one of four Common Core questions.
- To what extent do you approve or disapprove of the Common Core State Standards? (neutral wording)
- To what extent do you support or oppose having teachers in your community use the Common Core State Standards to guide what they teach? (PDK)
- As you may know, over the past few years states have been deciding whether or not to implement the Common Core State Standards, which are national standards for reading, writing, and math. In the states that have these standards, they will be used to hold public schools accountable for their performance. To what extent do you support or oppose the use of the Common Core Standards in California? (Education Next)
- A version of a PACE/USC Rossier legacy question that provides a pro-CCSS and an anti-CCSS explanation and asks respondents to pick one.
This design allows us to explicitly compare the results from wordings used in multiple national polls, and it also allows us to compare California-specific results to national figures. So, what did we learn?
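For readers curious about the mechanics, here is a minimal sketch of how a four-way split-sample assignment like ours works; this is hypothetical illustrative code, not our actual survey software, and the condition labels are just shorthand for the four wordings above.

```python
import random

# Hypothetical labels for the four question wordings described above
WORDINGS = ["neutral", "pdk", "education_next", "pace_legacy"]

def assign_conditions(n_respondents, seed=0):
    """Shuffle respondent IDs, then deal them evenly across the four wordings."""
    rng = random.Random(seed)  # fixed seed only so the sketch is reproducible
    ids = list(range(n_respondents))
    rng.shuffle(ids)
    groups = {w: [] for w in WORDINGS}
    for i, rid in enumerate(ids):
        groups[WORDINGS[i % len(WORDINGS)]].append(rid)
    return groups

groups = assign_conditions(2400)
print({w: len(g) for w, g in groups.items()})  # 600 respondents per wording
```

Because assignment is random, any systematic difference in responses across the four groups can be attributed to the question wording rather than to who happened to answer which question.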
First, we learned that the wordings used by Education Next and PDK did indeed affect the results they obtained. Using the Education Next wording, we saw support leading opposition 52/29. In contrast, using both the neutral (26/31) and PDK (24/27) wordings, we saw support trailing opposition. Clearly, how you word the question affects the results you get.
But second, we saw that the PDK results almost certainly cannot be entirely explained by question wording. To see how we reached this conclusion, compare the results we observed using the Education Next wording with the results Education Next itself reported: 52/29 vs. 49/35. Those results are quite close, just a few points of difference on both support and opposition, and the gap is likely attributable to the fact that California voters are more liberal than national averages and the state has seen less Common Core controversy than some others.
In contrast, our results using the PDK wording are wildly different from the results PDK reported: 24/27 vs. 24/54. They differ in two main ways. First, far more people offered a response to this question in the PDK poll than in ours, suggesting that more people in their sample felt informed enough to opine (probably marginal respondents who know quite little about the topic). Second, while the proportion supporting is the same, the proportion opposing is twice as high (!) in the PDK sample.
How could it be that our results differed from Education Next's by just a few points but differed from PDK's by 27 points on opposition? I think these results suggest that question wording alone cannot fully explain the differences. So what are the possible explanations? I see two as most likely:
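To get a feel for how unlikely a 27-point gap is to arise from sampling error alone, here is a back-of-the-envelope two-proportion z-test. The sample sizes are assumptions for illustration only (roughly 600 per wording condition in our poll, and a nominal 1,000 for the PDK national sample), not the polls' actual figures.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-statistic using the pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Opposition under the PDK wording: our result (27%) vs. PDK's report (54%).
# Sample sizes here are illustrative assumptions, not the polls' true n's.
z = two_prop_z(0.27, 600, 0.54, 1000)
print(round(z, 1))  # magnitude far beyond any conventional significance threshold
```

Under any plausible sample sizes, a gap this large is many standard errors wide, so sampling noise alone is a poor explanation; something else about the polls must differ.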
First, it’s possible there’s something wrong with PDK’s sample or with its pollster’s methodology. Though Gallup has a strong national reputation, its methods have been criticized in the past by some notable polling experts. It could be that those problems are at work here, too.
Second, it’s possible that question ordering is affecting support in the PDK poll. In particular, PDK asked nine questions about standardized tests before getting to the Common Core question (at least, to the extent that I can discern the ordering from their released documentation). In contrast, we asked neutral right-track/wrong-track questions about the governor, the president, and California schools, and Education Next asked about support for schools, topics covered in schools, and school spending. Perhaps that ordering had something to do with the results.
Either way, I think these results add further support to the conclusion that PDK’s results (certainly on Common Core, but probably in general) shouldn’t be taken as gospel. Quite the contrary: their results are an outlier and should be treated as such until PDK demonstrates findings more in line with what we know about public opinion.
One final surprise: I wasn’t expecting the PDK wording to track the neutral wording as closely as it did in our data.