Words matter—especially words in survey questions. They can make the difference between getting good data, bad data, and mystery data. Exhibit A: the word “uninsured.”
The Affordable Care Act (ACA) has helped millions of Americans secure health insurance. But a recent NPR poll asked about the law’s effect on the total number of Americans who have health insurance, and about half of its respondents either answered incorrectly or said they didn’t know. Meanwhile, a recent NBC/WSJ poll indicated the highest favorability ratings the ACA has seen yet. Why, then, did the NPR poll’s respondents (and others) appear not to know the law’s effect?
As I develop data products at Civis, I’m constantly digging into data not only to assess its quality and accuracy, but also to answer the question, “Does this data mean what I think it means?” Because of this focus, I pay close attention to the words people use to talk about data.
Clarity is particularly important when discussing a complex issue like health insurance coverage. I’ve seen this firsthand. We partnered with Enroll America to build a model that produced an “uninsured score,” and many people had the same question: Does a higher score mean they’re more likely to have insurance or not to have insurance? (For the record, the higher the score, the more likely not to have insurance.)
In the case of the NPR poll, I speculated that the wording of the survey question—and the word “uninsured” in particular—may have similarly confused respondents. The question read:
Since passage of the Affordable Care Act (Obamacare), has the number of UNINSURED Americans increased, decreased or stayed the same?
Increased
Decreased
Stayed the same
When I read this, I had to pause to determine what it was actually asking. I’m not a survey scientist, but I hypothesized that with clearer wording, we’d find that a majority of Americans do know that health care coverage has expanded dramatically.
I took my theory to the Civis Analytics survey science team, and they designed a survey to test it. We ran both questions on our weekly tracking poll: the NPR version, and our version, which omitted the pesky word “uninsured.” Our question read:
Since the passage of the Affordable Care Act (Obamacare), which of these is true about the number of Americans covered by health insurance?
Fewer Americans have health insurance
More Americans have health insurance
About the same number of Americans have health insurance
Our results suggest that question wording had a significant impact on a respondent’s likelihood of answering the question correctly. When we used our wording, there was a 15 percentage point increase in correct responses compared to when we used NPR’s wording.
Another indication that the “uninsured” wording confused people is the high rate of “don’t know” responses. Respondents to our version were 6 percentage points less likely to say they didn’t know the answer. When we pushed the “don’t know” respondents to choose an answer, correct responses increased by 6% for our version and 12% for NPR’s.
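For readers curious how one might check whether a 15-point gap like this is statistically meaningful, below is a rough two-proportion z-test sketch in Python. The per-arm sample sizes and correct-response rates here are illustrative assumptions (an even split of the 1,199 interviews and a hypothetical 50% vs. 65% correct rate), not figures from the study.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions."""
    # Pooled proportion under the null hypothesis of no difference
    p = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Illustrative numbers only: assume the 1,199 interviews split evenly
# between the two wordings, with 65% correct on ours vs. 50% on NPR's.
z = two_prop_z(0.65, 600, 0.50, 599)
print(round(z, 2))  # → 5.25, well above the 1.96 two-sided 5% cutoff
```

Under these assumed numbers, a gap that size would be very unlikely to arise from sampling noise alone, which is consistent with calling the wording effect significant.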
These results are evidence that public understanding of the positive impact of Obamacare is likely better than the original poll suggested. This is good news for the ACA, and demonstrates the importance of adhering to best practices for survey design.
Words matter. My parents were right: it’s obvious I shouldn’t call my sister a “doo-doo head,” but it turns out there’s also a science behind writing good survey questions. Thankfully, Civis Analytics has a team of social scientists who have studied it. What’s more, they’ve packaged their best practices into our new decision application, Civis Research. We believe that everyone should have access to the best methodologies, and our hope is that Civis Research will let you easily measure public opinion about your brand without falling into common traps.
Methodological Note: Civis Analytics conducted 1,199 online interviews on January 16, 2016. Respondents were sampled from nationally representative voter and consumer files. The margin of error is +/- 3.3 percentage points.
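For anyone who wants to sanity-check the reported margin of error, here is the standard 95% formula for a proportion, taken at the worst case of p = 0.5. Note that the simple-random-sampling figure for n = 1,199 comes out near 2.8%; the reported 3.3% presumably also reflects a design effect from weighting, which this sketch does not attempt to reproduce.

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """95% margin of error for a proportion; worst case at p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# Simple-random-sampling margin for n = 1,199 interviews
print(round(100 * margin_of_error(1199), 1))  # → 2.8 (percentage points)
```

The gap between 2.8 and the reported 3.3 is typical for weighted online samples, where weighting inflates the effective variance.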