The more people know about Common Core, the less they know about Common Core

Today marks the release of the second half of the PACE/USC Rossier poll on education, our annual barometer on all things education policy in California [1]. This half focuses on two issues near and dear to my heart: Common Core and testing. Over the coming days and weeks I’ll be parsing out some interesting tidbits I’ve uncovered in analyzing results from this year’s poll.

The first finding worth mentioning has to do with Common Core support and knowledge. We’ve all read arguments like “The more people know about Common Core, the less they like it”. For instance, we see that claim from NPR, Republican legislators, and hackish tea party faux-news sites. This claim is generally based on the finding from several polls that people who say they know more about the standards are less likely to support them (or, more generally, on the trend that reported knowledge and opposition have both increased over time). It turns out, however, that this may not be as true as you think.

To test knowledge of Common Core, we first asked people to tell us how much they know about the Common Core (a lot, some, a little, nothing at all). Then, we asked them a series of factual and opinion questions about the standards, to test whether they really did know as much as they said they did. The results were quite illuminating.

It turns out that people who said they knew a lot about Common Core were actually the most likely group to report misconceptions about the standards, and the group with the highest level of net misconceptions (correct conceptions minus misconceptions, so more negative means more misinformed). For instance, 51.5% of people who said they knew “a lot” about Common Core incorrectly said it was false that Common Core includes only math and ELA standards. In contrast, just 31.7% of this group answered the statement correctly (a net misconception index of -20). Among people who reported knowing only a little about the standards, the index was just -11 (33% incorrect, 22% correct).
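To make the index concrete, here is a minimal sketch of the calculation in Python, using the figures quoted above for the “only math and ELA” item (the function name is mine, not anything from the poll instrument):

```python
# Net misconception index: percent answering an item correctly minus
# percent answering it incorrectly, so negative values mean misconceptions
# outnumber correct beliefs for that group.

def net_misconception_index(correct_pct: float, incorrect_pct: float) -> float:
    """Correct conceptions minus misconceptions, in percentage points."""
    return correct_pct - incorrect_pct

# Figures quoted above for the "only math and ELA standards" item.
groups = {
    "a lot":    {"correct": 31.7, "incorrect": 51.5},
    "a little": {"correct": 22.0, "incorrect": 33.0},
}

for label, g in groups.items():
    idx = net_misconception_index(g["correct"], g["incorrect"])
    print(f"{label:>8}: {idx:+.0f}")   # "a lot": -20, "a little": -11
```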

Another item on which Common Core-“knowledgeable” people were more likely to be wrong was the claim that Common Core requires more testing than previous state standards. 57% of this group wrongly said this was true, while just 31% correctly said it was false (net misconceptions -26). All groups had net misconceptions on this item, but the margin was -19 for the “some” knowledge group, -16 for the “a little” group, and -11 for the “none” group.

In terms of raw proportions of misinformed individuals, the “a lot” of knowledge group is also the most misinformed group about the Obama administration’s role in creating the standards and the federal government’s role in requiring adoption.

In short, yes, individuals who say they know more about the standards are less likely to support the standards. But, as it turns out, that’s not because they actually know more (they don’t). Rather, it’s likely because they “know” things that are false and that are almost certainly driving their opposition.

So the next time you see someone claiming that “the more people know about Common Core, the less they like it,” feel free to correct them.


[1] Part 1 of this year’s poll has already been released; it focused on the Local Control Funding Formula and overall attitudes toward education in California. You can read more about it here.


On Common Core, can two polls this different both be right?

It’s everyone’s favorite time of year! No, not Christmas (though this lapsed Jew increasingly finds the Christmas season enchanting). It’s Education Poll Season!

A few weeks ago we had Education Next’s annual poll. Yesterday was Phi Delta Kappan/Gallup. And over the next couple weeks there will be results from the less heralded but no-less-awesome poll put out by USC Rossier and Policy Analysis for California Education [1]. It’s great that all of these polls come out at once because:

  1. It’s so easy to directly compare the results across the polls (at least when they ask similar enough questions).
  2. It’s so easy to spot hilariously (and presumably, maliciously) bad poll-related chicanery.

In today’s analysis, I’m going to discuss results from these and other polls pertaining to public support for the Common Core standards. I’ve done a little of this in the past, but I think there are important lessons to be learned from the newest poll results.

Finding 1. Support for the Common Core is probably decreasing. Education Next asked about Common Core in the same way in consecutive years. Last year they found a 54/26 margin in favor (+28); this year it was 49/35 (+14). PDK found 33% support and 60% opposition last year (-27); this year it was 24/54 (-30). In both cases net support has fallen, though not by much in PDK. The PACE/USC Rossier poll will add to this by tracking approval using the same questions we have used in previous years.

Finding 2. Voters still don’t know much about Common Core. In PDK, 39% of voters reported having heard just a little about Common Core, or nothing at all (I’m also counting “don’t know” here, which seems to me to mean much the same thing as “not at all”). In Education Next, 58% of respondents did not know whether Common Core was being implemented in their district, an even more direct test of knowledge. While neither poll this year asked respondents factual questions about the standards to gauge misconceptions, I’m quite confident misconceptions remain widespread given what polls found last year. The PACE/USC Rossier Poll will add to this by testing the prevalence of a variety of misconceptions about the standards.

Finding 3. Folks continue to like almost everything about Common Core other than the name. For instance, Education Next finds that voters overwhelmingly support using the same standardized test in each state (61/22), which aligns with the federal government’s efforts in supporting the consortia to build new assessments. Voters also are quite favorable toward math and reading standards that are the same across states (54/30). Finally, PDK finds that voters are much more likely to say their state’s academic standards are too low (39%) than too high (6%), which supports the decisions states are making with respect to new Common Core cut scores.

Finding 4. It seems likely that the wording of Common Core questions matters for the level of support reported, but we don’t yet have enough good evidence to say for sure. Education Next was criticized last year for the wording of its Common Core question, which was:

As you may know, in the last few years states have been deciding whether or not to use the Common Core, which are standards for reading and math that are the same across the states. In the states that have these standards, they will be used to hold public schools accountable for their performance. Do you support or oppose the use of the Common Core standards in your state?

The question was criticized for invoking accountability, which most folks favor. Because the folks at Education Next are savvy and responsive to criticism, they tested the effect of invoking accountability, asking the same question but without the “In the states …” sentence, and found support fell to 40/37. Though PDK was criticized last year for their question, they appear to have stuck with the same questionable item. The PACE/USC Rossier poll directly tests both the 2014 PDK and Education Next questions, plus two other support/opposition questions, in order to cleanly identify the impact of question wording on support.

Finding 5. As compared to every other reasonably scientific poll I’ve seen that asks about Common Core, PDK produces the most extreme negative results. Here are all the polls I have found from the last two years and their support/opposition numbers (sorted in order from most to least favorable):

Public Policy Institute of California 2014 (CA): 69/22 (+47)
Education Next 2014: 54/26 (+28)
NBC News 2014: 59/31 (+28)
Public Policy Institute of California 2015 (CA): 47/31 (+16)
Education Next 2015: 49/35 (+14)
Friedman Foundation 2015: 40/39 (+1)
University of Connecticut 2014: 38/44 (-6)
PACE/USC Rossier 2014 (CA): 38/44 or 32/41, depending on question (-6, -9)
Louisiana State University 2015 (LA): 39/51 (-12)
Monmouth University 2015 (NJ): 19/37 (-18)
Times Union/Siena College 2014 (NY): 23/46 (-23)
Fairleigh Dickinson 2015: 17/40 (-23)
PDK 2014: 33/60 (-27)
PDK 2015: 24/54 (-30)
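
For what it’s worth, the ranking above is just simple arithmetic: net margin = support minus opposition, sorted most to least favorable. Here is a minimal Python sketch reproducing it from the numbers listed (for the PACE/USC Rossier 2014 entry I use the first of its two question wordings):

```python
# Net margin = support - opposition; sort from most to least favorable.
# Numbers are the ones listed above.
polls = [
    ("Public Policy Institute of California 2014 (CA)", 69, 22),
    ("Education Next 2014", 54, 26),
    ("NBC News 2014", 59, 31),
    ("Public Policy Institute of California 2015 (CA)", 47, 31),
    ("Education Next 2015", 49, 35),
    ("Friedman Foundation 2015", 40, 39),
    ("University of Connecticut 2014", 38, 44),
    ("PACE/USC Rossier 2014 (CA)", 38, 44),   # first question wording only
    ("Louisiana State University 2015 (LA)", 39, 51),
    ("Monmouth University 2015 (NJ)", 19, 37),
    ("Times Union/Siena College 2014 (NY)", 23, 46),
    ("Fairleigh Dickinson 2015", 17, 40),
    ("PDK 2014", 33, 60),
    ("PDK 2015", 24, 54),
]

for name, support, oppose in sorted(polls, key=lambda p: p[1] - p[2], reverse=True):
    print(f"{name}: {support}/{oppose} ({support - oppose:+d})")
```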

Only one other national poll in the past two years comes within 20 points (!) of the negative margin found by PDK; everything else that negative comes out of a state that has had particularly chaotic or controversial implementation. Now, it could be that PDK’s results are right and everyone else’s are wrong, but when you stack them up against the others it sure looks like there’s something strange in those findings. It might be the question wording (again, since PACE/USC Rossier is using their exact wording, we can test this), but my guess is it’s something about the sample or the questions they ask before this one. This result just seems too far outside the mainstream to be believed, in my opinion.

Finding 6. The usual suspects of course pounced on the PDK poll to score points. Randi Weingarten used the results on Twitter to make some point about toxic testing (the use of a buzzphrase like that is a pretty clear sign that your analysis isn’t so serious). At the opposite end of the spectrum (which, increasingly, is the same end of the spectrum), Neal McCluskey said the results showed Common Core was getting clobbered (though, to his credit, he questioned the strange item wording and also wrote about Education Next last week, albeit in a somewhat slanted way).

So there we have it. Common Core support is down. But if you don’t call it Common Core and you ask people what they want, they want something very Common Core-like. They still haven’t heard much about Common Core, and most of what they think they know is wrong. And they almost certainly aren’t as opposed as PDK finds them to be. That’s the state of play on Common Core polling as of now. Our poll, coming out in a couple weeks, will address some of the major gaps described above and contribute to a deeper understanding of the support for the standards.


[1] Disclosure: Along with David Plank and Julie Marsh, I’m one of the three main architects of this poll.

Tests are the worst! Or the best! No, the worst!

A new Quinnipiac poll is out today. As always, I think it’s best to take these polls not as single data points in favor of one particular position, but rather as part of a broad sea of often contradictory, incoherent evidence about what/whether the public thinks about education.

There are some interesting nuggets in here, and again fodder for both “sides” of current education reform debates. The teachers’ unions and their supporters will love that the poll finds voters favor the unions’ policies over Governor Cuomo’s by a substantial margin (note that Cuomo’s overall favorability rating is net positive, so the weakness of support for his education policies is particularly striking; that said, I wonder how much people even understand what his education policies are). The reformsters will love that a majority thinks the number of charter schools in the state should be expanded. Nothing new here; support for charters in polls is almost always net positive.

What’s most interesting to me, though, is a series of questions about standardized testing. To me, these questions make painfully apparent the utter lack of coherence (or, to put it much more charitably, the nuance) in the public’s views of testing. First, we have the question “Do you think teacher pay should or should not be based on how well their students perform on standardized tests?” The results here are a resounding NO, 69/28. The results are similar for whether standardized tests should be used for teacher tenure. [1]

Then we have the question “How much should these tests count in a teacher evaluation: 100%, 75%, 50%, 25%, or not at all?” Now, you would imagine the answers would be mostly “not at all,” since in the two immediately preceding questions folks said the results shouldn’t be used for pay or tenure. Nope! In fact, 49% of people say these tests should count 50% or more in teacher evaluation, and another 27% say 25%. Just one-fifth of respondents (21%) say not at all. Hardly an anti-test bunch, these voters.

And finally, we have the question “Do you think standardized tests are or are not an accurate way to measure how well students are learning?” At this point I guess you’d have to think that voters would say yes, since in the immediately preceding question 77% said these tests should count for at least something in teacher evaluation. But you’d be wrong again! 64% said that standardized tests were NOT an accurate way to measure how well students are learning.

So, tests are not an accurate way to measure student learning, but they should definitely count for at least a quarter of a teacher’s evaluation, but they shouldn’t count at all in tenure or pay decisions. Got it. Suffice it to say, this is yet another example of why it’s immensely problematic when people pick a single data point from one poll and use it in support of their existing position.

[1] Jacob Mishook, on Twitter, notes that these wordings could be construed to imply 100% reliance on standardized tests for these decisions, which is a fair point that might explain at least part of the very negative response.

Everyone’s got an opinion about everything

Okay, not everyone. And not everything. But surprisingly many people about surprisingly many things. This will be the first of many posts about public opinion polling data, something in which I have increasing interest (even if little technical expertise).

Today’s interesting nugget comes via NPR, which reports on a recent little exercise done by Public Policy Polling. It seems that after a random tweet from a TCU professor, PPP polled voters and found that they had stunningly negative views of this person (whom they could not possibly have heard of)–3% favorable to 20% unfavorable. The money quote:

The big lesson for Farris, who is already thinking about how she’ll work this experiment into her next political science class, is in “pseudo-opinions.”

“People will offer an opinion when they don’t actually have one,” she said. “There is social pressure to answer, and give some type of opinion, whether it’s right or wrong.”

There is a recent boom in public opinion polls on education, and I am willing to bet many of the same trends come into play. Despite their general lack of knowledge about education issues, Americans want to give their opinions. In particular, for example, polls suggest that Americans pretty strongly support local control and teachers while also supporting weakened labor protections and testing. I’m sure some of this support is real. But I’ll bet a good chunk of it is just pseudo-opinions. Hopefully well-crafted polling and research can be used to help discern the difference.