When the authors of a huge study with a sample size of 1.1 million people of European descent were asked about policy lessons gleaned from their study, they said: ‘None whatsoever’. At a time when funders insist that researchers show the impact of their studies on policies and programmes, this blunt answer seems rather baffling. That this exchange appears in a list of frequently asked questions (FAQs) longer than the study itself is another surprise.
The genome-wide association study (GWAS) examined genetic factors associated with educational attainment, and its large sample yielded a significant positive association. So, why did the researchers say that there were no policy lessons?
The authors themselves answer the question best in FAQ 3.7 – ‘In our view, responsible behavioral genetics research includes sound methodology and analysis of data; a commitment to publish all results, including any negative results; and transparent, complete reporting of methodology and findings in publications, presentations, and communications with the media and the public, including particular vigilance regarding what the results do – and do not – show (hence, this FAQ document).’
Knowing what to communicate, and what not to
The researchers were clearly wary of their findings being misinterpreted as showing that genetic factors predict educational attainment. Such a misguided interpretation could lead to discrimination through genetic screening of students. Pre-empting the misleading media headlines that their study might generate, the authors clarify that the study was not about ‘genes for educational attainment’. They write: ‘most of the variation in people’s educational attainment is accounted for by social and other environmental factors, not by additive genetic effects’.
Such care in linking findings to policy implications is the ethical and responsible thing to do, no matter what the area of research. We also need to be careful not to compare apples and oranges: the type of research, the questions asked and the study design can all affect policy relevance.
But not all researchers are careful. At 3ie, we have had varied experiences. There are research reports that, rightly, avoid making policy recommendations when none seem to arise from the data analysis. However, we also see studies in which the stated findings and policy implications do not follow from the data that have been collected and analysed.
The value of a good research communication plan
Interestingly, although the GWAS authors insisted that no policy lessons should be drawn from their study, they made sure they didn’t underplay its significance for designing future social science research. An accessible article on the study in The Atlantic brings in the perspectives of the authors and sector experts on how the findings could be used. The article shows that preventing the misuse of research findings does not necessarily mean tucking them away. The research communication strategy included carefully phrasing findings and implications to prevent misuse and using appropriate communication tools to convey the message. This robust communication strategy is what makes the GWAS a compelling example for others to emulate.
Using the most appropriate tools to reach your audience
Policy briefs are a popular and effective research communication tool, as are infographics and other products that communicate key messages in plain language and in an accessible way. However, in some cases, brief messages can leave the door open for misinterpretation. Using multiple tools for different audiences, the GWAS authors communicated research findings and implications effectively.
FAQs as an effective communication tool
A list of FAQs explaining study results could seem like asking for more of readers’ limited time and attention. The advantage is clearer communication of takeaways from the findings, including how not to interpret them for future research, policy or programming. Questions and answers around the findings put them in perspective without the baggage of headlining direct impact on programmes or policy. However, extremely dense and technical language in FAQs could defeat the whole purpose of translating findings in ways that aid understanding without overreaching. With the GWAS, a news article that quoted heavily from the authors and the FAQs was the most effective translation of the study for a general audience.
At 3ie, we are committed to monitoring and measuring the impact of the studies we support. We also strongly believe that high-quality research studies should carefully and correctly draw and convey the implications of findings for decision-making. Using the most appropriate communication tools to translate technical research information accurately and accessibly for different audiences is an integral part of our communication and advocacy efforts. The GWAS example shows us an underused tool that helped researchers prevent misunderstanding of their findings while promoting interest in them.
With input from Radhika Menon.
Tags: advocacy, data, ethics, evidence, monitoring, policies, research