
CMAJ, v.170(9); 2004 Apr 27


Science reporting to the public: Does the message get twisted?
Lay people get a substantial amount of information about health and related topics from the media. Communicating with the public through the media can be vexing for medical professionals because they lack direct control over the final reporting. It is the reporter's framing of the information and his or her words that reach the public, rather than the scientist's or the clinician's. Moreover, there is a mismatch between the expectations of the scientist and those of the reporter. From the medical researcher's perspective, the news should be a sort of double helix, with media reports correctly matching, letter for letter, the original scientific publication. But this ideal is unachievable because the public cannot understand the language of an increasingly subspecialized scientific enterprise. Even scientists have trouble communicating across subdisciplines. Given that the ideal cannot be realized, what standards should be used for judging reporting about medical science and how well is current reporting meeting those standards?
The standards of accuracy applied in the popular press are set by the need for reporters to translate the precisely honed technical descriptions found in scientific writing into lively and clear summaries, using lay vocabulary. At best, this standard includes fidelity to sources, a balance among and inclusion of different viewpoints, and a translation that conveys some main idea from a study clearly and without the kind of distortions that might encourage inappropriate (or even dangerous) behaviour or unrealistic expectations.
Given these differences in expectations and the lack of personal control, it is not surprising that medical researchers are quick to blame the media for problems with public communication and to assume that media coverage is characterized by inaccuracy. However, the research that Tania Bubela and Timothy Caulfield 1 present in this issue (page 1399) suggests that these assumptions are not well founded. Their results indicate that media reports are reasonably accurate, except in specific types of controversial areas, and that cases of inaccuracy may be as much a product of the researcher's overenthusiasm as of error by the reporter. These findings are consonant with related research.
Early social scientific research on genetics reporting tended to assume that the media were “getting it wrong,” 2 but it soon became clear that anecdotal and sweeping assessments were subject to observer bias. It was all too natural for critics to notice and reprint examples of egregious reporting, portraying these as typical of the rapidly burgeoning area of genetics reporting. Similarly, these critics applied their own assumptions — usually that any favourable reporting about genetics was undesirable — to condemn all reporting about genetics as bad, simply because much of it contained favourable elements. More recent research has used quantitative measures, paid attention to sampling and generalizability, used standardized and accepted measuring instruments, checked coder reliability and used increasingly sophisticated analytic methods such as the classification and regression tree analysis applied in the study by Bubela and Caulfield. 1 Such research highlights that studying media coverage of genetics is as complex as the genetics itself, for communication is not a simple, linear process.
Bubela and Caulfield explicitly address the question of the fidelity of news reports to the scientific reports on which they are based. They find that disagreement between scientific findings and media reporting is rare. It might well be, however, that both journalists and scientists offer an exaggerated vision of the prospects of genetic medicine. The methods used by Bubela and Caulfield would not define such congruent exaggeration as inaccuracy. But, using different methods and a different sample, Mountcastle-Shah and associates 3 found a similar level of inaccuracy and noted that exaggeration by the media occurred in only a minority of news reports.
Another possibility is that press coverage could be faulted for being “unbalanced.” That is, even though it might not exaggerate wildly or contain blatantly inaccurate statements, it might have a pro-genetics or anti-genetics slant. In fact, an overly optimistic slant has been detected in most studies. 4 , 5 , 6 , 7 This lack of balance includes a tendency to quote from scientists more than from other sources 4 , 5 , 7 and a failure to include topics such as potential risks 2 or specific ethical considerations. 8
In addition to difficulties in defining the criteria by which press coverage of science should be assessed, there is evidence of differences among specific newspapers and across specific topics, as Bubela and Caulfield's data suggest. 1 Differences have also been found between different types of stories, coverage by different media and coverage published at different times: generally, “hard” news reports are more accurate than feature stories, 9 print media are more accurate than television, 10 and later coverage is more accurate than earlier coverage. 5 , 7
The use of exaggeration or slant (whereby some features are ignored and others are overemphasized) may be motivated by the conflicting responsibilities faced by both scientists and journalists, as a study by Wilcox 11 has made clear. She noted that reporters need to gain newspaper space (and ultimately an audience) for their topics, so they are prone to include sensationalistic, absolutist or at least dramatic statements. This drive conflicts with the norms of science journalism, which encourage cautious, detailed, balanced reporting, thus reflecting the norms of the science that is covered. Such conflicts may become evident as inconsistencies in coverage. In the interview setting, these dynamics are often manifested as reporters' efforts to get researchers to state the practical implications of the research, and the subsequent tendency to portray these applications as more general and perhaps more immediate than they may well be. 2 Similarly, researchers may be influenced by conflict between their responsibilities to the norms of science and their desire for academic promotion, grants for the maintenance of research staff and laboratories, or simply personal financial gain through patents and royalties. 12 They may also be under pressure from commercial sponsors of their research.
What then does the line of research represented by Bubela and Caulfield's study mean for medical researchers trying to communicate with the public about health? First, researchers should assume that what they say in their scientific publications may make its way into the public sphere and that press coverage may treat speculative “discussion” sections as fact. Second, researchers should talk to reporters. The public, who fund research with their hard-earned tax dollars, and patients, who indirectly fund research by purchasing products developed by private industry, have a right to know about that research, and news reporters are a major conduit. But researchers must prepare for such interviews as carefully as they would prepare for a talk at a scientific conference. The researcher should know exactly what she or he intends to communicate to the reporter and should resist the reporter's efforts to gain commentary that is different from what the researcher wishes to communicate. Detailed guidance about what is desired by lay audiences, as agreed upon by scientists and reporters, is provided by Mountcastle-Shah and associates; 3 noteworthy in their catalogue are replication status, opinion of outside experts, prevalence of the genotype and phenotypes, and symptoms of disease. It is perfectly reasonable to ask a reporter to send written questions in advance of the interview. When that is not possible, the researcher should at least be prepared to avoid answering the reporter's inevitable question — “What is this good for?” — with an enthusiastic forecast of potential applications. Given the dynamics of science reporting, the reporter will probably feature rosy forecasts if the scientist is willing to offer them, yet such forecasts may all too often come to be seen as broken promises. 
If it is appropriate to link the research to the development of medical treatments, supplying colourful metaphors that emphasize the distance from the ultimate goal may be a good way to do so. Thus, the researcher might say that this is one baby step on the long journey toward the cure for X or that it is one tiny piece in the giant jigsaw puzzle that might someday enable prevention or treatment of Y. Finally, if a reporter exaggerates or is inaccurate in the final story or broadcast, the researcher should let the reporter's editor know and should tell other researchers about the experience with that particular reporter.
An old communication aphorism has it that one “cannot not communicate.” This is as true for medical researchers as it is for others; the corollary is that they must act responsibly in communicating with the media.
See related article page 1399
Competing interests: None declared.
Correspondence to: Dr. Celeste Condit, Department of Speech Communication, University of Georgia, 110 Terrell Hall, Athens GA 30602; fax 706 542-3262; [email protected]

How Should Journalists Report a Scientific Study?

A three-step process and a framework of questions to make ethical reporting decisions, with recent convalescent plasma reporting as an example.

Charles Binkley is the director of Bioethics, and Subramaniam Vincent is the director of Journalism & Media Ethics, both with the Markkula Center for Applied Ethics. Views are their own.
If you are a journalist reporting on COVID-19, you may be running into scientific studies, reports, and findings on a whole host of topical matters: vaccines, treatments, epidemiology, and more. Since March of this year, findings have emerged every week and month from studies on everything from the propagation of the virus to antiviral treatments to vaccine trials. The recent reporting on the convalescent plasma trials is a good example. How do you assess the value of the scientific evidence so that your story does not overstate or understate the importance of the findings or their relevance?
To start off, let’s look at how doctors keep pace and assess levels of evidence in medical literature in general. There are 30,000 medical and scientific journals in the world publishing over two million articles annually. Not all the reports that get published are relevant to actually treating patients. The goal of medical research is to improve the health of society. Two ways of doing this are by extending life (or preventing early death) and improving quality of life (or decreasing human suffering). While some of the studies that get published in medical journals actually affect how clinicians treat diseases, most do not.
But COVID-19 has recently thrust upon society the raw data from several important medical studies. What is more, many other related studies are likely coming in the future. In the past, these results weren’t usually of much interest to the general population. Doctors would read and evaluate the studies, then decide if the results should be translated into clinical practice. This is called evidence-based medicine.
Because of the huge societal benefit that current COVID-19 trials may hold, however, results are now being reported to the public at the same time that doctors are given the data to review. Since not everyone has the training or experience to decide what should be implemented and what should be ignored, it is important to understand the different kinds of trials and the weight they hold in guiding medical decisions.
The central concern in any medical study is that doctors want to be sure the effect they observe is due to the intervention they initiated and not to some other factor. Some studies are better at ensuring this cause/effect relationship than others. Applying this to journalistic work, here is a basic framework for evaluating medical evidence, with each level assessed by how solid the cause/effect relationship is.
LOW : Studies with the lowest quality data are not very likely to ever be translated into clinical practice. The lowest quality study is based on expert opinion or a limited number of cases. It basically says, “I am an expert in this area and I did this thing to some patients and this is what happened.” These are not very convincing studies. Imagine a study in which convalescent plasma was given to three not very sick patients with COVID-19 and they all survived. Or they all died. Or the patients were all very sick, and they all recovered. Or they all died. It is hard to know what that means.
HIGH : Studies with the highest quality data are often a major driver of medical decisions. They are double blinded, randomized, controlled trials (RCT)—often referred to as the gold standard of studies. Patients are randomly assigned to receive either the therapy being studied, or the currently accepted best treatment. If there is no current treatment for the disease being studied, the treatment is compared to a placebo like sugar or salt water. The study is double blinded, meaning that neither the patient, physician, nor researcher knows who is getting what until after the study ends. A good RCT minimizes the variables so that doctors can be sure that the effect they observe is due to the intervention they are studying.
MIDDLE : Findings from this category require consistent observations across multiple studies, each conducted independently, before they are given serious consideration for clinical implementation.
When reviewing a study, journalists need to determine which bucket a study fits into. The advantage of this approach to evaluate medical science findings is that it helps journalists build news judgment around a vocabulary shared with doctors. And that means scientists will recognize the questions journalists ask, and in turn their responses can then be interpreted by reporters in the same shared context.
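As a rough illustration of the bucketing above, the decision can be sketched as a small function. This is a hypothetical simplification for journalists, not a clinical grading tool; real evidence-grading frameworks such as GRADE weigh many more factors (bias, endpoints, effect size, replication).

```python
def evidence_level(randomized: bool, controlled: bool,
                   double_blinded: bool, n_patients: int) -> str:
    """Crude sketch of the LOW/MIDDLE/HIGH buckets described above."""
    # HIGH: a double-blinded randomized controlled trial (the "gold standard")
    if randomized and controlled and double_blinded:
        return "HIGH"
    # LOW: expert opinion or a handful of cases with no comparison group
    if not controlled and n_patients < 10:
        return "LOW"
    # MIDDLE: everything in between, e.g. case-control and cohort studies
    return "MIDDLE"

# Three COVID-19 patients given plasma, no control group:
print(evidence_level(False, False, False, 3))  # prints "LOW"
```

The cutoffs here (such as the ten-patient threshold) are invented for illustration; the point is only that study design, not the direction of the result, determines which bucket a study falls into.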
Walkthrough of the convalescent plasma example
This played out with the convalescent plasma trial for COVID-19. Not only was the benefit of convalescent plasma for patients with COVID-19 difficult to determine from the trial, the trial’s true importance may have been overstated by government officials. This can leave the public wondering what to, and what not to, believe from the government.
Patients with COVID-19 were transfused with plasma from patients who had COVID-19 previously and recovered. There were a lot of variables: when in their illness patients were transfused, how sick the patients were when they were transfused, and how many antibodies the transfused plasma contained. The study neither compared patients who received convalescent plasma to patients who did not receive it, nor did it decide in advance to randomly assign similarly sick patients to either receive plasma or not.
The study enrolled over 35,000 patients, which is great. Say that they had randomized half of those patients to receive plasma from people who had recovered from COVID-19, and half to receive plasma from patients who had never had COVID-19. Also, let’s say that all the patients were equally sick and the level of antibodies from the COVID-19 patients was the same in every dose administered. Finally, and essential to prevent bias, neither the patients, nor researchers, nor clinicians, knew which patient was getting which kind of plasma. The results of that kind of trial would carry a lot of weight in making decisions about whether or not convalescent plasma was beneficial.
The middle bucket needs careful parsing
All other studies are somewhere between the lowest and the highest levels. Like the gold standard RCT, these studies will initiate a treatment and then look for an effect. They differ in two important ways: randomization and bias. In these other studies, it is not decided in advance which patients will get which treatment. Rather, the patients are treated, usually based on the discretion of their physician, and an outcome is observed. If there is a “control” or comparison group, it will be a group of similar patients, selected after the fact, who did not receive the treatment being studied. Again, this is usually because their physician did not order or recommend the treatment. Finally, since decisions about giving the treatments were made by individual physicians, no one can know all the facts that went into that decision.
Clearly, there is more of a risk that these middleweight investigations, mostly composed of case-control and cohort studies, do not establish a true cause-and-effect relationship. However, the more of these types of studies that have similar findings, the more convincing the findings are. They will never be as good as an RCT, but the more well-planned case-control and cohort studies that demonstrate the same result, the more likely there is a cause-and-effect relationship, and the more likely the intervention is to be accepted in medical practice.
Here are some questions journalists can ask to frame their inquiry and next steps:
1. What is the level of medical evidence in this study? Does it fall into the LOW, HIGH, or MIDDLE category? Ascertain this with the scientists who published it.
If LOW, ask these questions of the scientists who issued the findings:
- Why is the study significant now? How does it move the needle?
- What questions does it leave unanswered, according to scientists themselves?
- What are the scientists planning to do next?
If HIGH, ask these questions:
- What impact will this study have on direct patient outcomes? (survival, quality of life, cost savings)
- Is this a new and revolutionary finding or one that is accepted and just needed to be validated?
- Will this study change the current practice and how?
If it’s in the MIDDLE (i.e., it’s neither LOW nor HIGH), ask these questions:
- Do your findings support or challenge the findings of other investigators exploring the same issue?
- What further study is needed in order to translate your findings into clinical practice?
- How would you summarize your findings and their importance to the lay reader? Would you caution how the lay reader interprets the findings?
2. Next, corroborate your answers with what other medical professionals say. These would be peers of the scientists who issued the findings you are considering reporting on. This is a good way to let medical professionals weigh in on the level of evidence in a specific way that helps public education.
3. Use these answers to assess the findings’ newsworthiness in the public interest. If this study has been overstated in some other media outlet, the answers will help you decide whether you must do a story with a “debunking frame.” Overall, decide whether there’s even a need to report a LOW or MIDDLE evidence-quality study. How does it help? If you do report, explaining the three levels of evidence in medical literature and where this study fits would be helpful to the reader for its educative value. This is also transparency about your method of evaluation in action.
4. If the study falls into HIGH as corroborated by other medical professionals, you have a story of significance.
Journalists tend to understate — not exaggerate — scientific findings, study finds
The study compared paper abstracts with corresponding news articles and found that journalists generally expressed more uncertainty in their writing.

Science journalism has seen a resurgence during the pandemic. Every day seems to bring a new study, a new paper, a new finding to break down and interpret for readers.
But how well do journalists communicate those findings?
A study from the University of Michigan found that journalists tend to understate the claims of scientific papers. As part of their research into how certainty is expressed in science communication, School of Information Ph.D. student Jiaxin Pei and assistant professor David Jurgens compared hundreds of thousands of paper abstracts with corresponding news articles reporting on those papers’ findings.
“The findings presented in the science news are actually lower than the certainty of the same scientific findings presented in the paper extracts,” Pei said.
Pei and Jurgens found that journalists can be very careful in their science reporting, sometimes downplaying the certainty of findings. Their research contradicts claims that journalists exaggerate scientific discoveries.
To reach their conclusions, Pei, Jurgens and a team of human annotators calculated certainty levels in scientific abstracts and news articles pulled from Altmetric, which tracks news stories mentioning scientific papers. They then built a computer model that could replicate these calculations, allowing them to analyze hundreds of thousands of articles and papers.
Those discrepancies became especially evident when they analyzed different aspects of certainty, such as “number.” For example, their results, along with prior research, indicate that journalists may be replacing precise numbers found in scientific papers with language like “roughly” to make their writing more accessible.
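As a purely illustrative toy, the kind of certainty comparison described above might be approximated by counting hedging cues in each text. The actual model built by Pei and Jurgens is a trained classifier; the word list and example sentences here are assumptions invented for the sketch.

```python
# Toy hedging-cue counter: a crude stand-in for a trained certainty model.
HEDGES = {"may", "might", "could", "roughly", "approximately",
          "suggests", "possibly", "appears", "perhaps"}

def hedge_rate(text: str) -> float:
    """Fraction of words that are hedging cues (0.0 = no hedging)."""
    words = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
    if not words:
        return 0.0
    return sum(w in HEDGES for w in words) / len(words)

abstract = "The drug reduced mortality by 23% in treated patients."
news = "The drug may have roughly reduced deaths, the study suggests."
```

Under this toy metric the news sentence scores higher (more hedged, i.e., less certain) than the abstract, mirroring the understatement pattern the study reports.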
While they don’t have definitive answers as to why journalists understate scientific findings, Jurgens hypothesized that one reason may be that reporters believe it is better to err on the side of caution. He notes that journalists’ work can be difficult. They must translate scientific work so that it is comprehensible to a general audience.
While some believe that overclaiming scientific findings is worse than underclaiming them, Jurgens said the latter can also have negative effects. He pointed to reporting on COVID-19 vaccines as an example.
“Scientists are fairly certain that the vaccines are safe,” Jurgens said. “But I think bringing up the uncertainty regarding that could lead people to be less vaccinated or to maybe not seek out health care. In this case, within the pandemic setting, it could mean a loss of life, which is a fairly serious outcome.”
Pei and Jurgens also examined how “journal impact factor” — their proxy for measuring quality of science — affects the way journalists present scientific conclusions. They found that the journal where a study originated did not seem to influence how reporters described scientific uncertainty.
That can be a problem, Pei said, since higher-impact journals have a more rigorous reviewing process. Knowing the prestige or trustworthiness of the journal in which a scientific paper is published could be useful information for readers.
For journalists who wish to improve how they describe scientific findings, Pei advises that they talk to the scientists behind the study they are trying to cover. Jurgens noted, however, that scientists can also be “pretty bad communicators.”
“It’s an open question,” Jurgens said. “How do we effectively communicate this in a way that is accessible?”
Asked how certain they were about their own results, Jurgens and Pei said they were “pretty certain.” The model they built produced certainty levels that were very similar to the ones calculated by the annotators, and their analysis included hundreds of thousands of data points. Their paper was published in the Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing.
Pei and Jurgens noted, however, that they could always use more data and that their study didn’t look at other areas where people might perceive exaggeration in news — such as headlines.
The next step in their research is to talk to journalists and determine what tools they could use to improve their reporting. They have been thinking about ways to help reporters translate scientists’ work for general audiences.
One step Pei and Jurgens have already taken is publishing code that allows journalists and scientists to calculate certainty levels in their writing.
“There are a lot of open questions in this field (natural language processing),” Pei said. “With more efforts in this area, we’ll be able to provide tools and systems for journalists to cover science.”

Times Insider
How Times Reporters Handle Scientific Studies
When is research considered reliable? The answer isn’t always fully known. Here’s the approach our journalists take in evaluating studies and their results.

By Emily Palmer
Times Insider explains who we are and what we do, and delivers behind-the-scenes insights into how our journalism comes together.
After early studies showed promising results of the anti-malarial drug hydroxychloroquine in coronavirus patients, President Trump quickly promoted it as a possible treatment and later announced that he was taking the drug as a preventive measure.
But publishers of the study from France that Mr. Trump had referenced said that it fell short of their standards, while researchers in Brazil examining the related drug chloroquine halted their research after patients given high doses developed potentially fatal heart arrhythmias. Throughout its coverage, The Times has cited scientists’ reservations about the drug’s effectiveness and reported last week that the first controlled study of the drug found it did not prevent infections in people exposed to the virus.
The misjudgments about hydroxychloroquine underscore the importance of how reporters at The Times cover scientific research: They study the study, and tell readers what’s known and what’s not. Even as scientists work feverishly to answer questions about the pandemic, science reporters carefully parse fact from conjecture, truth from folly.
That’s necessary now especially because “there’s a flood of new science, much of it seat-of-the-pants,” said Celia W. Dugger, the health and science editor.
Before you read about a study in The Times, reporters will have looked into the researchers’ backgrounds and often consulted three to five outside experts to determine the quality of the work. Reporters also ask questions like: What are the margins of error? Did the study include enough patients to get meaningful results? And what are the shortcomings of the research?
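The margin-of-error and sample-size questions above can be made concrete with the standard back-of-the-envelope formula for a sample proportion. This is a simplified sketch that assumes a simple random sample, and the figures are hypothetical:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion p
    observed in n subjects (z = 1.96 for a 95% confidence level)."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical study reporting a 40% response rate:
small = margin_of_error(0.40, 25)    # ~0.19 -> 40% plus or minus 19 points
large = margin_of_error(0.40, 2500)  # ~0.02 -> 40% plus or minus 2 points
```

The same headline number carries very different weight at different sample sizes, which is exactly why reporters ask whether a study included enough patients to get meaningful results.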
Historically, reporters have considered studies published by major science journals — like Nature, The Lancet and The New England Journal of Medicine — to be the most reliable, because those publications use expert editors and rigorously vet research methods and conclusions by sending them to other scientists to evaluate.
But last week, both The Lancet and The New England Journal of Medicine retracted big coronavirus studies because they were based on data that could not be verified. The Times had reported on one of those studies.
Cases like this highlight the need for vigilance. Before citing such research, Times reporters will generally seek out independent experts to comment. That’s critical now because during the pandemic, the length of time from experiment to published study has been “short-circuited,” said Mike Mason, a deputy science editor, adding that what once took as long as a year and a half has been shortened in some cases to weeks.
Now many researchers post their work online in what is known as a preprint, or a study that gets released without the standard practice of peer review. Some preprints contain observational or anecdotal work that doesn’t meet more exacting scientific standards like randomly selected patients or control groups enabling more definitive conclusions. Times reporters who write about preprints try to make clear to readers what the research shows and what it leaves unanswered.
Those outside the field have rarely paid attention to these preprint studies — until now. Politicians and everyday citizens sometimes look to studies in their infancy.
Reporters are not looking for the perfect study — science is not definitive and even the highest quality research has limitations. But “right now, with all these reports pouring out, there’s more uncertainty than usual,” said Denise Grady, a science reporter. “That’s hard for everybody to accept at a time when we all wish there were answers and a clear way forward.”
The Times’s guidelines recognize that reporters must balance the weight that they give to such studies with the need to provide information about those that wind up in the global conversation.
“We will do everything we can to be straight with readers about what we know and what we don’t know,” said Ms. Dugger. “Science isn’t the discovery of a final truth or the be-all-end-all. It’s a process. Our knowledge will get better as we go along, but it’s still worth sharing with people the evolution of what we know.”
Tips for incorporating scientific research into your reporting.
Research studies, crisis reports, and expert documents are important resources for well-researched reporting that explains the trends behind complex global crises.
However, research reports can be lengthy, dense, and difficult for reporters to transform into engaging stories. It doesn’t have to be that way.
“I remember looking around the press briefing room at last year’s COP26 when a climate crisis report was unveiled. The complex graphs bored many journalists, and that’s not uncommon,” said Paul Adepoju, the community manager of the ICFJ’s Global Crisis Reporting Forum.
“Journalists are often unsure how to handle such reports beyond the executive summaries,” he acknowledged.
As a science journalist and contributor to Nature Africa, Adepoju said he learned how to translate complex, technical scientific studies into accessible news stories that go beyond the executive summaries.
Adepoju spoke with Akin Jimoh, the editor in chief of Nature Africa, during a recent ICFJ Global Crisis Reporting Forum webinar entitled “Transforming studies and reports into top news stories.” Jimoh offered tips on how reporters can use research studies to produce stories that readers can easily understand.
Why science reporting is important
The COVID-19 pandemic revealed that science reporting and the public’s understanding of it are essential, Jimoh noted.
As a former Knight Science Journalism fellow at the Massachusetts Institute of Technology, Jimoh gained experience translating science for the public. As a fellow, he used his journalism background to report on and incorporate scientific reports into his articles.
“Whichever way you look at it, scientific journalists are the ones who bridge the gap between those who do not understand something and making it understandable to others,” he said.
“One tool that has been useful for me is called EurekAlert, which sends alerts on scientific news, research materials or press statements to institutions that are subscribed to it. It is advisable for journalists to be on their mailing list as they also send embargoed newsletters so that one is able to plan their reporting properly,” said Adepoju.
Jimoh said that science journalism is a special kind of reporting that requires extra care. “We need to interpret the science, we are the go-between the science and the information that the public has to know and understand,” he said.
However, science reporters also have to remember that they are journalists first, and scientists second. “Sometimes having a background in science can be self-limiting because you want to obey the ethics of science, or ethics of research. [But] as a journalist you want to obey the ethics of journalism,” he said.
For example, scientific ethics might cause scientists to wait before publishing their findings to say whether something is right or wrong, while journalists have to work on what is best known at the time of writing, and clearly state that to readers.
In a recent article published in the Guardian, Professor Jonathan Wolff, head of philosophy at University College London, argued that journalists' focus on balance in their reporting has been problematically applied to coverage of scientific findings, helping create false notions of what is actually up for debate.
What kind of scientific research to trust
When searching for reports, be sure to use official sources such as the World Health Organization (WHO), which is the authority when it comes to health issues, Jimoh explained.
“If it is a journal, one must look at where it was published. There are journals that we know are credible, then there are those that are more or less copycats, which just exist because of the funding they receive,” he said.
“Researchers spend a lot of their time in the laboratory, researching for 20-30 years until they find an answer. This will be a major finding and research also builds on existing work done by others,” said Jimoh.
Jimoh advised journalists to look at the funders supporting the research: “Ask if this is a commercial research, what the interests are, who is sponsoring it, which acknowledgement surrounds that particular publication. Look at the literature review, does it cover gaps? Look at the list of the references. Does the research give a new perspective? These are factors we need to constantly look at, at the end of the day.”
Components of a research report
Most research reports can be broken down into the following core components:
- Results/findings
- Discussions of results, conclusions and recommendations
- References (which can provide further opportunities for news features)
- Acknowledgements
Of these core components, the conclusions or recommendations are often the most useful for journalists. For both scientific reports and journal articles, the results or findings give you perspective on what the researchers found, according to Jimoh.
“The references and citations can also direct you to other work if for example you want to do a news feature and need more reports and links. Knowing how to read a report is only half of the work: the other half requires staying on top of current trends and discoveries by reading frequently and across multiple disciplines,” said Jimoh.
“We need to monitor and read widely in terms of key scientific or research issues, such as climate change, agriculture, engineering, etc. We need to create time to read because as a science journalist you need to keep reading and noting what is going on,” acknowledged Jimoh.
Jimoh concluded by saying that science journalists do not report in a vacuum, and at the end of the day their responsibility is to educate audiences on the current issues. “We are able to address policy-related issues from what we are writing. Once we make audiences aware of what is going on, that goes a long way.”
Additional resources
- EurekAlert! Science News Releases
- Alpha Galileo
Specific research reports
- Climate Change 2022: Mitigation of Climate Change
Perspective Article
Science Training for Journalists: An Essential Tool in the Post-Specialist Era of Journalism
- Metcalf Institute for Marine & Environmental Reporting, Natural Resources Science, University of Rhode Island, Kingston, RI, United States
A majority of US adults are concerned about a rise in misinformation regarding current issues and events. The spread of inaccurate information via social media and other sources has coincided with a massive transition in the news industry. Smaller newsrooms now have fewer journalists, and their responsibilities have shifted toward producing more stories, more quickly, while contributing to their outlets’ blogs and social media feeds. Lean newsroom budgets also eliminated in-house professional development for journalists, making external training programs an essential vehicle for reporters and editors to gain new content knowledge, sources, and skills in a constantly evolving news landscape. The loss of specialized beat reporters in many newsrooms since the mid-2000s has made training especially critical for journalists covering complex, science-based topics such as climate change and public health. In the USA, relatively few organizations offer science training opportunities for journalists, but the need and demand for these programs are growing as newsrooms increasingly rely on generalist reporters to cover a wide range of scientific topics. This perspective summarizes the challenges that non-specialist reporters face in covering science-based stories and describes a successful training model for improving science and environmental news coverage to yield reporting that is not only accurate but also offers the nuance and context that characterizes meaningful journalism.
Americans have been bombarded with claims of “fake news” since November 2016, when Donald Trump began to reference the term following the US presidential election. The term had previously been used to refer to satirical television comedies such as “The Daily Show” and “The Colbert Report” that used a faux-journalistic format ( Borden and Tew, 2007 ). As of late 2016, however, “fake news” became part of the cultural zeitgeist in the USA, inspiring responses ranging from comedians’ punch lines to rumor-based vigilantism ( Fisher et al., 2016 ).
The purposeful spread of inaccurate information is nothing new, but a wide range of people have become concerned about fake news. In a December 2016 poll by the Pew Research Center (2016c) , 64% of US adults reported feeling that “fabricated news stories cause a great deal of confusion about the basic facts of current issues and events.” In this same survey, 84% of respondents reported feeling somewhat to very confident in their ability to detect fake news.
Their confidence seems at odds with the continuous spread of misinformation ( Chan et al., 2017 ). This has become a more pernicious problem in the era of social media, when anonymity and a much-accelerated version of the old-fashioned rumor mill ( Zubiaga et al., 2016 ) allow misinformation to be spread easily, quickly, and without fear of repercussion. The freedom to spread false information on social media is exacerbated by broader communication challenges related to cognitive bias, motivated reasoning, and increasingly deep identity divides along socio-economic, political, and/or cultural lines ( Kahan, 2015 ; Flynn et al., 2017 ). Selective exposure to specific information sources may be another culprit ( Boxell et al., 2017 ; Schmidt et al., 2017 ). As a result, people experience a daily flood of information that may or may not be accurate. Often, individuals are left to determine the legitimacy of this information on their own, through the disparate lenses of their own biases.
Against this backdrop, public discourse about environmental issues, especially climate change, has become a political minefield ( Painter, 2013 ; Kahan, 2015 ) in which science is often perceived as just another opinion, rather than a foundation for discussion about policy options and practical solutions.
Challenges for Journalists Covering Environmental Topics
How can we address this misinformation dilemma? What are the mechanisms for increasing access to accurate, objective information and facilitating informed public discourse on critical environmental issues?
There is no single answer to these questions, but one important piece of the solution is to ensure that news coverage is not only accurate but also clear and properly contextualized. News coverage remains influential in setting public agendas regarding what news consumers talk about and how policy makers respond, especially with regard to environmental issues ( Dunwoody and Peters, 1993 ; Boykoff and Yulsman, 2013 ; Hansen, 2015 ). Unfortunately, the journalism industry has suffered significant losses since the mid-2000s ( Friedman, 2015 ; Pew Research Center, 2016a ), resulting in newsrooms whose reporting staff bring a much reduced breadth of expertise ( Pew Research Center, 2013 ). The expectations of journalism in the era of social media pose another challenge to producing nuanced reporting. In many newsrooms, smaller reporting staffs’ expanded reporting duties are compounded by the requirements for crafting multiple blog and/or social media posts each day ( Friedman, 2015 ).
The challenge of providing news coverage that is simultaneously accurate, contextualized, and compelling is especially salient with regard to environmental stories. Massive newsroom layoffs affected mainstream news outlets’ science and environment coverage significantly, eliminating many of these specialty beats ( Bagley, 2013 ) and/or shifting these stories to less experienced reporters who function as generalists, rather than specialists ( Crow and Stevens, 2012 ; Boykoff and Yulsman, 2013 ). Environmental coverage is complicated by its necessary mixture of science, policy, and personal opinion. Reporters must navigate scientific research, sorting out areas of consensus and debate, and weigh scientific perspectives along with those of affected communities and political agendas. As news outlets have moved toward distributing, or “mainstreaming,” environmental stories across the newsroom ( Friedman, 2015 ), and assigning these stories to non-specialists, the quality of scientific content has suffered for a number of reasons.
First, very few US journalists bring a science background to their work. Sachsman et al. (2008) reported that 3% of US journalists had an undergraduate major in science. This is not a hindrance for all types of news coverage, but it is unrealistic to expect a reporter whose last formal experience with science may have come from high school or a single college course to identify the nuances in a scientific debate or recognize the larger environmental context that might be relevant to a particular story. Furthermore, a limited facility and confidence with probabilities and statistics among many journalists makes it difficult or impossible for them to critically analyze scientific claims and the risks of action or inaction ( Painter, 2015 ). In two surveys of journalism school administrators spanning 1997–2008, only 25% thought their students received sufficient statistical instruction, leading the study authors to describe training in statistical reasoning as the “castor oil of journalism pedagogy” ( Dunwoody and Griffin, 2013 ). Without these educational foundations, it is much easier to produce stories focusing on political debate or drama related to environmental issues ( Boykoff and Yulsman, 2013 ), or to simply report two opposing viewpoints, than it is to produce illuminating reporting that accurately translates areas of scientific consensus and debate. Nisbet and Fahy (2015) described this as a process leading to journalism “dominated by voices representing the tail ends of opinion.”
Second, environmental stories are inherently complex and, therefore, time intensive. Depending on the audience, a single story about the effects of sea level rise on a coastal community in Rhode Island, for example, could be informed by researchers studying rates of glacial melting in the Arctic, loss of coastal wetlands in southern New England, coastal engineering, and economic effects on tourism-reliant businesses, in addition to community members and government officials. While this diverse blend of sources could lead to an informative and well-contextualized story, it would also require more time for reporting and an ability to weave the science and engineering background in with the political and personal perspectives. It is not surprising that a reporter without a science background, in a newsroom that expects multiple stories to be filed each day, might default to a one- or two-source story lacking broader context and insights ( Gibson et al., 2016 ).
Third, scientists’ ineffective communication styles impede clear summaries of their work. Academics’ use of jargon, as well as their reticence to comment on the broader significance of their research (or even speak with a reporter), can make it difficult or impossible to use their quotes or insights within a news story. It is easy for a journalist who lacks a familiarity with the process and culture of science to be swayed by the clear and compelling, but not necessarily accurate, arguments of a politician, activist group, or a vocal community member.
This is certainly not a complete list: framing, editorial disinterest, media ownership, and many other issues could be added to the list of complicating factors for environmental reporting ( Boykoff and Yulsman, 2013 ; Anderson, 2015 ). As a result of the specific challenges identified here, however, journalists who are new to covering science-based stories, or who do so only on occasion, are at a distinct disadvantage, for which news consumers pay the price.
Science Training for Journalists
Training journalists to become more discerning translators of scientific information is one mechanism for addressing these challenges. This type of professional development can build journalists’ understanding of scientific methods and uncertainty and help them place environmental stories within a broader scientific context, giving news audiences a much richer suite of information from which to form their opinions.
The University of Rhode Island’s Metcalf Institute for Marine & Environmental Reporting has conducted 54 science trainings for journalists since 1999. Over this time, Metcalf training has evolved to accommodate the needs, interests, and time constraints of professional journalists. The Institute currently offers a range of programs: more comprehensive learning over the course of a week, intensive 1–2 day science seminars that explore the science underlying specific environmental topics, conference-based programs that provide brief introductions to issues, and webinars that expose participants to individual speakers with expertise in environmental science, policy, and/or communication.
Metcalf Institute’s Annual Science Immersion Workshop for Journalists provides a rare deep-dive into the process of conducting scientific research. The Annual Workshop Fellowship introduces journalists to the science of global change with a focus on coastal ecosystems. The hands-on experiences in the lab, field, and classroom give Annual Workshop Fellows a greater familiarity with environmental science and access to a wide range of sources and scientific resources. The most important objectives are more fundamental, however. The Workshop facilitates off-deadline conversations between scientists and journalists that explore the slow, iterative process of research; explain how researchers work to minimize and manage scientific uncertainty; and build mutual understanding about the cultures and norms of both science and journalism. These interactions between journalists and scientists and also among the journalist Fellows change participants’ approaches toward reporting on science-based topics ( Smith et al., 2017 ), while helping participating scientists hone their own communication skills.
The demand for this type of training is significant. Metcalf Institute typically receives more than 100 applications for the 10 available Annual Workshop fellowship spots. A growing number of applicants are based outside of the USA, often in developing nations where journalism training is scarce and training related to environmental reporting is even harder to come by. Many of these applicants live in places where environmental issues are an essential underpinning of socio-political concerns, yet the environment receives minimal or no coverage.
Interest in the shorter science seminars for journalists is also intense, attracting applicants from across the USA, from large and small news outlets and from all media types. In short, there is a substantial demand for journalist training on environmental issues that, in spite of the best efforts of Metcalf Institute and a small band of other programs with similar goals, is not being met.
These professional development opportunities for journalists are essential in a constantly evolving news landscape. Newsrooms no longer provide the training or in-house resources that once supported the development of novice reporters and advanced the capacity of more experienced reporters. Meanwhile, environmental challenges—and their solutions—are growing apace, driving a largely unmet demand for environmental news coverage ( Miller and Pollak, 2012 ).
Yet, funding for this type of training has become ever more challenging. While many organizations and individuals lament the superficial or insufficient news coverage of environmental issues, relatively few funders have stepped forward to provide substantive or consistent support for journalist training in this arena. In addition, there has been an expectation in recent years from some foundation funders that journalist training programs must result in a specified number of news stories by participants. This runs counter to the approach taken by Metcalf Institute and most of its sister organizations. These training programs always yield stories, but the Institute does not simply seek an output of “x stories per participant” after a training. Rather, Metcalf’s goal is to change participants’ approaches toward covering science-based stories for the rest of their careers.
Analyses of Metcalf training based on pre- and post-training self-efficacy surveys, content analysis, and interviews indicate that this more ambitious and long-term approach is effective. A study of Annual Science Immersion Workshop alumni showed positive changes in efficacy related to their confidence in covering scientific issues, understanding of how scientists conduct research, and ability to discern the credibility of scientific sources ( Smith et al., 2017 ). This study also found a domino effect of Metcalf training, with 90% of survey respondents reporting that they had shared information from their training experience with colleagues. Finally, pre- and post-training analyses of participants’ reporting showed changes in their framing of environmental topics, with post-training stories offering a broader scientific context and more frequent references to scientific uncertainty.
Clearly, science training for journalists can address some of the barriers journalists face when covering environmental topics. This training has become far more essential as environmental coverage has shifted away from specialist reporters, especially in non-elite newsrooms that serve smaller local and regional audiences.
Learning from Experience
For many years, the Annual Science Immersion Workshop for Journalists was designed to give participants a better understanding of scientific principles through personal experience: guiding them through an accelerated tour of “a day in the life of a coastal scientist,” from hypothesis generation to data collection and data analysis. This approach changed journalists’ perspectives, helping them to understand the deliberation and iteration that characterize scientific research. Interviews conducted by Smith et al. (2017) , however, revealed that some participants were more interested in the big picture than in the details of data collection or analysis. Specifically, interviewees noted the benefits of gaining a better sense of the “humanity” of scientists through their interactions at the Workshop and learning from scientists how to critically evaluate scientific publications.
Metcalf Institute adapted the Annual Workshop in response to the Smith et al. study. The program still offers an intense dive into global change science with a focus on coastal zones, but the activities are designed to offer more applied experiences that prepare trainees to approach any scientific topic with greater discernment and confidence.
In practice, this programmatic shift translates to a more explicit and iterative examination of topics that many reporters struggle to convey, e.g., scientific uncertainty, probabilities and statistics, and data visualization. The Workshop also features more interactive co-learning opportunities, such as role-play exercises and “shop talk” sessions that allow journalists and scientists to discuss how the Fellows might apply their new knowledge and skills to specific challenges they face in their reporting. While science still takes center stage in the program, it is supported by activities and interactions that encourage participants to question their approaches toward covering these topics. Critically, the experience also builds journalists’ confidence in asking questions about scientists’ research motivations and conclusions—information that can enrich participants’ reporting on a range of topics.
Training as a Tool for Optimizing the Post-Specialist Era
Some have called for a turn toward “knowledge-based journalism,” which would apply specialized expertise in relevant natural and social science to improve reporting related to both the content and process of public affairs issues such as the environment ( Patterson, 2013 ; Donsbach, 2014 ; Nisbet and Fahy, 2015 ). This ideal and especially the approaches identified by Nisbet and Fahy (2015) for achieving it are worth striving for and cultivating. Yet, there are far more non-specializing journalists covering environmental stories today than there are specialists and the news industry needs ways to improve the coverage produced by this larger, inexpert group. Local news outlets, for example, are highly unlikely to be able to accommodate the knowledge-based journalism approach, yet these outlets play a significant role in public discussion within their target constituencies ( King et al., 2017 ). Reich and Godler (2016) offered a potent critique of this debate when they argued that it is “more urgent to develop novel ways to optimize and cope with non-specialization rather than to lament its arrival.”
Larger, well-funded news outlets and smaller non-profit outlets will likely maintain some degree of specialized reporting on the environment. However, it is equally—and perhaps more ( Pew Research Center, 2016b ; King et al., 2017 )—important to ensure reporters at smaller, local news outlets have received sufficient training to build a basic fluency with the fundamental assumptions, limitations, and norms of scientific research and the confidence to pursue science-based stories. Metcalf Institute’s outcomes demonstrate that training via professional development is an effective tool for optimizing the science reporting skills of specialists and non-specialists, alike.
Journalism has played an essential role in public discourse for hundreds of years. As the industry continues its search for a successful, sustained business model in the Internet Age, the demands upon individual journalists and the public need for substantive reporting that counters misinformation continue to grow. In the meantime, newsrooms and journalism funders must use the available tools to facilitate the best possible journalism. A more widespread commitment to journalists’ ongoing professional development is an essential step in this process.
Author Contributions
SM is the sole author of this work. Any correspondence should be sent to her.
Conflict of Interest Statement
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
The author has not received any funding for this perspective paper.
Anderson, A. (2015). “News organization(s) and the production of environmental news,” in The Routledge Handbook of Environment and Communication , eds A. Hansen, and R. Cox (New York, NY: Routledge), 175–183.
Bagley, K. (2013). About a Dozen Environment Reporters Left at Top 5 U.S. Papers. InsideClimate News . New York, NY. Available at: https://insideclimatenews.org/news/20130114/new-york-times-dismantles-environmental-desk-climate-change-global-warming-journalism-newspapers-hurricane-sandy
Borden, S., and Tew, C. (2007). The role of journalist and the performance of journalism: ethical lessons from “fake” news (seriously). J. Mass Media Ethics 22, 300–314. doi: 10.1080/08900520701583586
Boxell, L., Gentzkow, M., and Shapiro, J. M. (2017). Is the Internet Causing Political Polarization? Evidence from Demographics, NBER Working Paper, No. 23258. Available at: http://www.nber.org/papers/w23258
Boykoff, M. T., and Yulsman, T. (2013). Political economy, media, and climate change: sinews of modern life. WIREs Clim. Change 4, 359–371. doi:10.1002/wcc.233
Chan, M.-P. S., Jones, C. R., Jamieson, K. H., and Albarracín, D. (2017). Debunking: a meta-analysis of the psychological efficacy of messages countering misinformation. Psychol. Sci. 28, 1531–1546. doi:10.1177/0956797617714579
Crow, D. A., and Stevens, J. R. (2012). Local science reporting relies on generalists, not specialists. Newspap. Res. J. 33, 35–48. doi:10.1177/073953291203300303
Donsbach, W. (2014). Journalism as the new knowledge profession and the consequences for journalism education. Journalism 15, 661–677. doi:10.1177/1464884913491347
Dunwoody, S., and Griffin, R. J. (2013). Statistical reasoning in journalism education. Sci. Commun. 35, 528–538. doi:10.1177/1075547012475227
Dunwoody, S., and Peters, H. P. (1993). “The mass media and risk perception,” in Risk as a Construct , ed. B. Ruck (Munich: Knesebeck), 293–317.
Fisher, M., Cox, J. W., and Hermann, P. (2016). Pizzagate: From Rumor, to Hashtag, to Gunfire in D.C . Washington, DC: Washington Post. Available at: https://www.washingtonpost.com/local/pizzagate-from-rumor-to-hashtag-to-gunfire-in-dc/2016/12/06/4c7def50-bbd4-11e6-94ac-3d324840106c_story.html?utm_term=0.295f2bb36990
Flynn, D. J., Nyhan, B., and Reifler, J. (2017). The nature and origins of misperceptions: understanding false and unsupported beliefs about politics. Adv. Pol. Pyschol. 38, 127–150. doi:10.1111/pops.12394
Friedman, S. (2015). “The changing face of environmental journalism in the United States,” in The Routledge Handbook of Environment and Communication , eds. A. Hansen, and R. Cox (New York, NY: Routledge), 143–155.
Gibson, T. A., Craig, R. T., Harper, A. C., and Alpert, J. M. (2016). Covering global warming in dubious times: environmental reporters in the new media ecosystem. Journalism 14, 417–434. doi:10.1177/1464884914564845
Hansen, A. (2015). “News coverage of the environment: a longitudinal perspective,” in The Routledge Handbook of Environment and Communication , eds A. Hansen, and R. Cox (New York, NY: Routledge), 208–215.
Kahan, D. M. (2015). Climate science communication and the measurement problem. Adv. Pol. Psychol. 36, 1–43. doi:10.1111/pops.12244
King, G., Schneer, B., and White, A. (2017). How the news media activate public expression and influence national agendas. Science 358, 776–780. doi:10.1126/science.aao1100
Miller, T., and Pollak, T. (2012). Environmental Coverage in the Mainstream News: We Need More. Project for Improved Environmental Coverage . SEE Innovation, 24. Available at: https://climateaccess.org/system/files/PIEC_Environmental%20Coverage.pdf
Nisbet, M., and Fahy, D. (2015). The need for knowledge-based journalism in politicized science debates. Ann. Am. Acad. Polit. Soc. Sci. 658, 223–234. doi:10.1177/0002716214559887
Painter, J. (2013). Climate Change in the Media . London: I.B. Tauris & Co. Ltd. In Association with the Reuters Institute for the Study of Journalism, University of Oxford.
Painter, J. (2015). Taking a bet on risk. Nat. Clim. Change 5, 286–288. doi:10.1038/nclimate2542
Patterson, T. E. (2013). Informing the News: The Need for Knowledge-Based Journalism . New York, NY: Vintage.
Pew Research Center. (2013). The State of the News Media 2013 . Available at: http://assets.pewresearch.org.s3.amazonaws.com/files/journalism/State-of-the-News-Media-Report-2013-FINAL.pdf
Pew Research Center. (2016a). The State of the News Media 2016 . Available at: http://assets.pewresearch.org/wp-content/uploads/sites/13/2016/06/30143308/state-of-the-news-media-report-2016-final.pdf
Pew Research Center. (2016b). Civic Engagement Strongly Tied to Local News Habits . Available at: http://www.journalism.org/2016/11/03/civic-engagement-strongly-tied-to-local-news-habits/
Pew Research Center. (2016c). Many Americans Believe Fake News Is Sowing Confusion . Available at: http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/
Reich, Z., and Godler, Y. (2016). “The disruption of journalistic expertise,” in Rethinking Journalism Again: Societal Role and Public Relevance in the Digital Age , eds C. Peters, and M. Broersma (New York, NY: Routledge), 64–81.
Sachsman, D. B., Simon, J., and Valenti, J. M. (2008). Enviornment reporters and U.S. journalists: a comparative analysis. Appl. Environ. Educ. Commun. 7, 1–19. doi:10.1080/15330150802194862
Schmidt, A. L., Zollo, F., Del Vicario, M., Bessi, A., Scala, A., Caldarelli, G., et al. (2017). Anatomy of news consumption on Facebook. Proc. Natl. Acad. Sci. U.S.A. 114, 3035–3039. doi:10.1073/pnas.1617052114
Smith, H., Menezes, S., and Gilbert, C. (2017). Science training and environmental journalism today: effects of science journalism training for early to midcareer professionals. J. Appl. Environ. Educ. Commun. 1–13. doi:10.1080/1533015X.2017.1388197
Zubiaga, A., Liakata, M., Procter, R., Wong Sak Hoi, G., and Tolmie, P. (2016). Analysing how people orient to and spread rumours in social media by looking at conversational threads. PLoS ONE 11:e0150989. doi:10.1371/journal.pone.0150989
Keywords: journalism, training, professional development, science journalism, environmental journalism, science communication, environmental communication
Citation: Menezes S (2018) Science Training for Journalists: An Essential Tool in the Post-Specialist Era of Journalism. Front. Commun. 3:4. doi: 10.3389/fcomm.2018.00004
Received: 11 November 2017; Accepted: 12 January 2018; Published: 30 January 2018
Copyright: © 2018 Menezes. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Sunshine Menezes, sunshine@uri.edu
This article is part of the Research Topic
Science and Environmental Journalism: Trends, Boundaries, and Opportunities for a Rapidly Evolving Industry

Journalists and Scientists Have Different Roles, But They Share a Goal — An Informed Public
Winter 2010 Newsletter
Melanie Lenart teaches a University of Arizona course on environmental writing in the department of Soil, Water and Environmental Science. Her book, Life in the Hothouse: How a Living Planet Survives Climate Change, is scheduled for release in April by the University of Arizona Press.
Sometimes a 900-word column in a major newspaper can bring more attention to the nation’s pending water shortage than a year’s worth of scientific papers. When Unquenchable author Robert Glennon of the University of Arizona’s law college wrote a Washington Post column called “Our Water Supply, Down the Drain,” his words potentially reached some three-quarters of a million readers—with internet access expanding potential readership by millions.
The hope of reaching more people is one reason some scientists and policy experts are writing for newspapers and other public venues. Another reason involves a growing tendency among grant-providing agencies to require public outreach. Finally, internet access is inspiring more scientists to share their thoughts directly with the public by posting details about their research and writing blogs.
Meanwhile, the number of working journalists keeps dropping as newspapers and magazines succumb to the overall economic downturn as well as hardships specific to publishing—which include the internet’s tendency to ignore copyrights.
With that in mind, I’d like to consider the many similarities, and the defining differences, between science and journalism, and how we support practitioners of each. As a newspaper reporter who also trained and worked as a scientist, and who now blends both backgrounds into a science writing career, I have seen the inner workings of both professions.
At their best, scientists and journalists both seek truth. This guiding principle enlightens research investigations, whether in the field and lab, or through legwork and interviews. Any type of research involves loads of background reading. In their articles, journalists and scientists both strive to set their own perspectives aside to consider other sides of an issue, with the understanding that new evidence could overturn a presumed truth.
The writing style they use differs greatly, with scientists favoring statistical analysis, passive voice and abundant journal references while journalists sprinkle their writing with anecdotes, quotes and real-world examples.
For writing to qualify as journalism, the writer should have no vested interest in the topic. In practice, a vested interest usually boils down to monetary terms—no payoff for a certain slant, no job with the organization featured, no stock in the company garnering the headlines. If conflicts exist, journalists are expected to mention them.
Articles are the product by which journalists make a living. Reporters receiving a weekly paycheck need to churn out daily articles to keep their jobs. Magazines thrive on freelance writing, typically paying by the word rather than the hours invested. Editors value independence more than affiliation.
Thus the source of support creates a big difference between scientists and journalists. Scientists typically earn their money from research grants or university appointments; their articles they usually write for free. Journalists traditionally were supported indirectly by advertising dollars, from ads running alongside their publications or broadcasts.
The easy access of the internet makes it hard to raise enough money from advertising, so publishers are looking for other avenues of support.
One avenue is the so-called New Media approach, which borrows the non-profit model: seeking grants and donations to research and write stories, or to support publications outright. High Country News and Mother Jones have courted donations for years, but most publications have relied only on advertising dollars and subscriptions for support.
It’s interesting that this latest twist on the media landscape would increase the similarities between journalists and scientists. At the moment, though, far fewer grant opportunities exist for those doing journalistic research than for those doing scientific research.
Yet we can’t expect scientists and policy experts in academic institutions to meet the need for informing the public. These scholars are busy doing research and writing papers for peer-reviewed journals, organizing conferences and workshops, and training and educating students. Generally, those in charge of promotions rank science writing and other types of outreach efforts far behind peer-oriented work.
And we can’t expect the media to provide the depth of scientific information needed to keep the public informed. Journalists are busy keeping an eye on government, serving their constitutional role as watchdogs.
Somewhere between science and traditional journalism lies the art of communicating about science beyond or behind the issues of government policy. The internet suits this mode. It can provide depth by allowing readers to “drill down” into a topic by following links to increasingly specific details.
People seek accessible information. This struck me again when I bumped into Robert Glennon at the Paradise Café while working on this guest view. His book Unquenchable has made a splash on the public scene, even landing him an interview with Jon Stewart on The Daily Show.
When I asked how the book was doing, Glennon noted Unquenchable had passed a mark reserved for the top 2 percent of books by selling more than 5,000 copies. Not necessarily big bucks, considering that authors typically earn $1 or less per copy. Still, it has other rewards for those interested in spreading a message. “I get an invitation to speak maybe once a day,” Glennon said. “I have to do triage.”
Books for the general public and the internet both offer wonderful ways to share information on complex issues such as science. Scientists and journalists consistently agree that we need more science stories. So let’s make sure we find a way to support writers, both official journalists and other writers who help fill in the details on science topics.
That’s one way to help citizens understand the science of complex issues, including water policy. Quenching the public’s thirst for knowledge can lead to better decision making about these important issues of sustainability.

Scientists and Journalists: Worlds Apart
By Laura Brockway

The distrust of journalists within the science and technology community is more pronounced than its distrust of the clergy, corporate leaders, the military, or politicians, according to research done by the First Amendment Center.
This is one of the most provocative ideas I gathered as one of the curious graduate students who attended a Science/Media Forum sponsored by the University of Alabama, Birmingham, and Research!America. The forum brought together scientists and local reporters from various media for a candid discussion about their necessary but often difficult relationship. It was moderated by Jim Hartz and Rick Chappell, authors of Worlds Apart: How the Distance Between Science and Journalism Threatens America's Future. Participants discussed sources of the distrust and potential solutions, including actions scientists can take to improve public and media understanding of science.
The need for public understanding of science
Government funding, through tax dollars, supports more than half of all scientific research in the United States. In order to maintain that level of funding, the public needs to support the work of scientists. Some might not think we are in danger of losing such support, riding the success of the campaign to double the National Institutes of Health (NIH) budget. In general, the U.S. public supports scientific research and values science stories in the media. Nonetheless, most do not understand the fundamentals of research or scientific processes. This makes it difficult for them to comprehend highly specialized areas of current research. This, in turn, can have a direct impact on support for science: A report by the Department of Energy, addressing the crisis of its dwindling purchasing power, identifies a lack of appreciation of its research by the public as a source of this problem.
The President's Committee of Advisors on Science and Technology Statement of Principles asserts that "Public support of science and technology should be considered as an investment for the future." Initiatives such as Project 2061 are addressing the need for improved science education in order to keep U.S. science and technology at pace with that of other countries. Outreach programs also promote science at the local level. Research institutions create jobs and increase the quality of life in the community.
Differences that affect the communication process
In the daily grind of lab life, scientists rarely focus their attention on such difficult issues; typically they address the gap only when it affects their own research niche. Scientists communicate their work primarily through scholarly publications and presentations at scientific meetings. Indeed, the peer-reviewed article is frequently described as the scientist's currency. But in both cases, journal articles and conferences, it is scientists talking to other scientists.
In this way, scientists restrict the flow of information between the scientific community and the public. But what role do journalists play? There is a genuine lack of mutual understanding between journalists and scientists, for many reasons. The problem is pervasive and, as touched upon in the forum, has serious consequences for our society.
Most journalists at the forum said that conveying science is a difficult task. Often scientists cannot translate the implications of their work into plain English. Scientists are most comfortable talking about their research using specialized jargon easily understood by other scientists. Moreover, scientists are often secretive about preliminary data for fear of being wrong or being "scooped." This might be particularly true in today's climate, which emphasizes the proprietary value of patents and intellectual property. There is also the remnant influence of Cold War secrecy, which might be reviving following the World Trade Center attacks. In addition, we often believe that our work speaks for itself and does not need promotion; in fact, colleagues might shun scientists seen as promoting science rather than doing science. Finally, according to Worlds Apart, most scientists do not trust journalists to report science accurately. They fear that if a reporter makes a mistake, some might think the scientist was to blame. Another contributing factor is that many journalists lack knowledge of the scientific process and are frustrated by the tentativeness of scientists.
Dissimilarities between journalism and science
Journalists acknowledge that journalism is a business. Ratings are essential to survival. A Center for Media and Public Affairs report found that local newscasts devoted 7% of their stories to health news--less than crime, weather, accidents/disasters, and human interest stories. The average health report lasted 2 minutes, and most focused on causes and treatments of diseases. The study also found that reporters without a specific beat reported the majority of health news. When asked about the criteria for choosing science or health stories, journalists at the forum agreed that their focus is often on scientific breakthroughs. Taken together, these findings mean that local newscasts cover only a marginal amount of science.
In contrast to the fast-paced field of journalism, science is incremental and slow. Scientists test theories in multiple ways, and even then can come up with alternative explanations of the data that support or refute a hypothesis. A great deal of science is curiosity driven and, although not directly related to humans or disease, is important nonetheless; it is usually not the subject of media attention.
Solutions for scientists
So what can we, as scientists, do to foster a better public appreciation of our science? First, learn how to translate science into understandable language. Play the role of the "civic scientist," one who actively engages in promoting research to nonscientists. Seek out broader audiences to whom you can present your science and its importance. Pulitzer Prize-winning planetary scientist Carl Sagan had many accomplishments, but arguably one of his most critical was his tireless promotion of science.
At the forum, Guy Caldwell of the University of Alabama, Tuscaloosa, said that "as taxpayers would uniformly support a war that they understand (e.g., terrorism), they should be made to understand that biomedical researchers are at the frontiers of a different war--one versus disease, a war that claims far more lives every year than all other wars combined." Caldwell believes that it is the scientist's obligation to inform the public about these issues.
Hartz and Chappell suggest that along with every scientific article, a public abstract in plain English (100 words or less, with no jargon) be posted on the internet for the public as well as journalists. This would undoubtedly help scientists as well as journalists. University public relations professionals can serve as liaisons between the scientific community and the media. In addition, scientists can help journalists understand the scientific peer-review process to avoid media overplay of preliminary work.
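The 100-word, no-jargon public abstract lends itself to a simple automated check. A minimal sketch, assuming a hypothetical, field-specific jargon list of my own invention:

```python
# Minimal sketch of an automated check for a plain-English public
# abstract: at most 100 words and no jargon terms.
# The jargon tuple below is a hypothetical example; a real list would
# be curated for each field.

def check_public_abstract(text, max_words=100, jargon=("in vitro", "p-value", "phenotype")):
    """Return a list of problems found in a draft public abstract."""
    problems = []
    word_count = len(text.split())
    if word_count > max_words:
        problems.append(f"too long: {word_count} words (limit {max_words})")
    lowered = text.lower()
    for term in jargon:
        if term in lowered:
            problems.append(f"jargon: '{term}'")
    return problems

draft = "We studied how city noise changes bird song. Birds near busy roads sang higher notes."
print(check_public_abstract(draft))  # → []
```

An empty list means the draft passes both checks; anything else tells the author exactly what to revise before posting.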
Scientists, professors and graduate students alike, who actively think about the ramifications of their work and talk about what they do in the lab with journalists as well as friends and family, are helping the cause of science.
Is the only reason to inform the public about science to secure future funding? Is the only way the public can participate through funding? I suspect that this kind of thinking is part of the reason that the gap between science and the public continues to grow. Journalists have a role to play. But scientists also need to begin to think more generously of their relationship with the public.
About the author
Laura Brockway

How to Hook Reporters on Your Science Research
January 28, 2021
Journalists shared these 12 tips on how CUNY scientists can attract attention for and accurate coverage of their research, at a recent Graduate Center Science Communication Academy workshop.

By Bonnie Eissner
Filling the media and the public in on scientific breakthroughs isn’t easy, but it’s vital, especially in light of a pandemic, climate change, and other urgent issues. That was one of the main takeaways from a recent Graduate Center Science Communication Academy workshop, “Meet the Reporter: Shaping STEM Research for General Media.”
CUNY science faculty and graduate students were treated to expert advice from Shawn Rhea, media relations manager at The Graduate Center, and CUNY Newmark J-School Professor Emily Laber-Warren, a veteran journalist and director of the J-School’s health and science reporting program. Twenty J-School graduates, all working journalists, joined to conduct mock interviews with the scientists and offer tips and feedback.
Rhea and Laber-Warren shared these tips on how scientists can draw media attention for their research.
1. Ask Yourself “Why Now?”
Even though science is incremental, and there’s rarely an aha moment, journalists “need to be able to hook a story to something that just happened or that recently happened or is trending,” Laber-Warren said.
2. Know Why Your Research Matters
Laber-Warren advised that scientists step back from their research and assess why their study matters to someone who isn’t knee-deep in it. Consider the larger questions that the work addresses.
3. Use Accessible Language
Precision is vital in science, and reporters need to be accurate, but they also need to use language that the public can understand and easily read. “In order to reach a general audience, you’re going to need to be comfortable with dialing down the precision a little bit,” Laber-Warren said.
“For a lay audience, sometimes less is more,” Rhea advised. She added that anecdotes and even generalizations can “come in really handy.” Also, use short, concise sentences when possible. “This will help your audience better absorb ideas,” Rhea said.
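The "short, concise sentences" advice can even be checked mechanically. A rough sketch that flags overlong sentences, assuming a 20-word threshold of my own choosing (not a figure from the workshop):

```python
import re

def long_sentences(text, max_words=20):
    """Return the sentences in text whose word count exceeds max_words."""
    # Split on sentence-ending punctuation and drop empty fragments.
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    return [s for s in sentences if len(s.split()) > max_words]

sample = ("Less is more. Anecdotes help. This sentence, however, rambles on and on "
          "through clause after clause until any lay reader has long since lost "
          "the thread of the original idea.")
print(len(long_sentences(sample)))  # → 1
```

A crude proxy, of course: it says nothing about jargon or structure, only length, but it catches the sentences most likely to lose a general reader.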
4. Metaphors and Analogies Make Your Work More Vivid
“Having metaphors in your back pocket is a great thing,” Laber-Warren said. As an example, she described talking to an immunologist for a COVID-19 story she was covering. He used the term “Goldilocks situation” when comparing COVID-19 to tuberculosis. She not only quoted him, but adapted his quote for the headline.
5. Emotion Is Important
Readers and audiences are interested in the human side of science. They want to know about the scientist’s experience. Laber-Warren advised the scientists to reflect on how they felt when they made their discovery. “Were there any surprises, or obstacles, or funny things that happened along the way? Scientists often take themselves out of their work. … We want you to put yourself back into it a little bit.”
6. Try a Catchy Title
Reporters scan through tons of studies, and you can help them out with an intriguing or chatty title.
7. Hook Reporters with the Abstract
Add a couple of sentences to your abstract to describe why the research matters and how it might impact issues or causes that you and others care about.
8. Don’t Wait Until Your Research Is Published
Ideally, journalists will cover your research on the same day the study comes out in a journal. Scientists at The Graduate Center should reach out to Shawn Rhea as soon as a study is accepted by a journal.
9. Communicate the New Information First
When writing an op-ed on your work or pitching or talking to the press, communicate the new information first. Rhea advised, “Tell the public the bottom line first. This is what the discovery is. This is why it’s important, and then you can go into what your supporting details are.”
Graduate Center students and faculty are invited to participate in The Thought Project blog on Medium. CUNY’s SUM website also covers research by CUNY students and professors. See its submission guidelines.

10. Know Your Audience
You may need to adjust your “why it matters” message and the specificity of your language based on your audience. Think about the reporter’s audience as well, whether it’s the general public or a more scholarly readership, including peers or potential funders.

11. Cultivate Your Online Profile
Whether you’re a student researcher or a faculty member, develop an online profile that covers the different areas you study and the topics or areas you’re passionate about.
12. Join Experts Lists
Make sure you’re on the relevant databases that journalists use, such as SciLine, 500 Women Scientists, Diverse Sources, etc. Other databases are listed on The Graduate Center’s Science Communication Academy website.

More from the Science Communication Academy
Scientists will find a wealth of resources, including a video of this presentation, on The Graduate Center Science Communication Academy website. CUNY scientists are invited to join the webinars, held at 2 p.m. on the third Friday of every month.
Published by the Office of Communications and Marketing.