@article{Kessler_2021, title={Scientific evidence/uncertainty (Science and Health Communication)}, volume={1}, url={https://www.hope.uzh.ch/doca/article/view/2h}, DOI={10.34778/2h}, abstractNote={<p>Measuring the presented scientific evidence and uncertainty in science communication can be achieved either by different variables (e.g., Brechman, Lee, & Cappella, 2009, 2011; Guenther, Bischoff, Löwe, Marzinkowski, & Voigt, 2019; Kessler, 2016) or by identifying frames (for thematic frames, see Ruhrmann, Guenther, Kessler, & Milde, 2015; for formal-abstract frames, see Kessler, 2016).</p> <p><strong><em>Field of application/theoretical foundation:</em></strong></p> <p>Evidence and (un)certainty are integral components of scientific findings and of science in general. Scientific evidence can be defined as a continuum, ranging from scientific uncertainty to certainty and from weak to strong evidence. Media content analyses investigate the extent to which media articles provide information indicating the evidence or uncertainty of scientific findings. Content analyses also measure the degree of evidence with which scientific findings are presented in the media.</p> <p><strong><em>References/combination with other methods of data collection:</em></strong></p> <p>In some cases, the effects of different uncertainty depiction styles (Retzbach & Maier, 2015) and of the depicted evidence frames (Kessler, 2016) are examined in experiments after their content-analytical identification.</p> <p><em><strong>Example studies:</strong></em></p> <p>Brechman et al. (2009); Brechman et al. (2011); Guenther et al. (2019); Kessler (2016); Retzbach & Maier (2015); Ruhrmann et al. (2015)</p> <p> </p> <p><strong>Information on Guenther et al., 2019</strong></p> <p><strong>Authors: </strong>Lars Guenther, Jenny Bischoff, Anna Löwe, Hanna Marzinkowski, & Marcus Voigt</p> <p><strong>Research question: </strong>When representing research results, how do German print and online media report on (a) relevant criteria to assess scientific evidence and (b) scientific (un)certainty?</p> <p><strong>Object of analysis: </strong>The study was based on a randomly selected artificial week to obtain a representative sample of German print and online media reports on science (<em>N</em> = 128 articles).</p> <p><strong>Time frame of analysis: </strong>July 6, 2015 to August 23, 2015</p> <p><strong>Info about variables</strong></p> <p><strong>Variables: </strong>“For each represented research result, a variable collected the main (hypo-)thesis of the research study, the direction of the result (for or against the thesis), as well as the relevant criteria to assess evidence. […] For each result, it was also relevant to collect to which extent scientific certainty or scientific uncertainty was discussed. In the current study, an explicit statement referring to (un)certainty was differentiated from an implicit statement (subjunctive, speculative language as an indicator of uncertainty versus imperative as an indicator for certainty). This was supplemented by collecting the justifications for (un)certainty that were given for the scientific results.” (p. 10)</p>
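<p><em>Illustration:</em> As a minimal sketch (not part of the original study), one coded research result could be represented as a simple data structure. The field names below are hypothetical and merely mirror the variable description quoted above; they are not taken from the original German codebook.</p>
<pre><code>from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only; names are hypothetical and mirror the quoted
# variable description, not the original codebook by Guenther et al. (2019).
@dataclass
class CodedResearchResult:
    main_thesis: str                    # main (hypo-)thesis of the research study
    direction: str                      # "for" or "against" the thesis
    evidence_criteria: List[str] = field(default_factory=list)  # reported criteria to assess evidence
    explicit_uncertainty: bool = False  # explicit statement referring to uncertainty
    explicit_certainty: bool = False    # explicit statement referring to certainty
    implicit_uncertainty: bool = False  # subjunctive, speculative language
    justifications: List[str] = field(default_factory=list)     # stated reasons for (un)certainty

result = CodedResearchResult(
    main_thesis="Substance X lowers the risk of disease Y",
    direction="for",
    evidence_criteria=["information about sample (size)", "funding source(s)"],
    explicit_uncertainty=True,
    justifications=["preliminary data, knowledge gap(s)"],
)
</code></pre>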
<p><strong>Level of analysis: </strong>news article</p> <p><strong>Variables and values: </strong></p> <ul> <li>reported relevant criteria to assess scientific evidence: theoretical assumptions/(hypo-)theses; pilot study/a study never done before; research design: experiment, case study, etc.; research and measurement instruments; quality criteria, such as reliability; quality criteria, such as validity; references to significance (statistic values); objectivity; information about sample (size); time of study; explicit depiction of the research setting; number of studies done; information about how results were obtained; limitations, such as knowledge gaps; comparisons to other studies; funding source(s); reference to the investigating researcher(s); reference to the publication/journal/conference; future scenarios, specific applications</li> <li>reported explicit justifications for scientific (un)certainty: preliminary data, knowledge gap(s); (poor) methodological quality; contrasting findings of research; contrasting interpretation of same dataset; conflicting viewpoints of researchers; doubt whether data can be applied to humans; effect on humans not clear; effect on nature not clear; lack of technical/scientific opportunities; justifications for certainty: certain single result(s); sufficient data; (strong) methodological quality; results pointing in the same direction; successfully replicated findings; application for humans clear; effect on humans clear; effect on nature clear; highly experienced researcher(s)</li> <li>implicit statement referring to (un)certainty: no implicit representation vs. implicit representation</li> </ul> <p><strong>Reliability: </strong>“Four experienced coders coded the articles of the sample after several intensive training sessions. Intercoder reliability was calculated according to Holsti for 26 articles (20 percent of the sample) and the following satisfactory results were obtained: formal variables: 0.97; criteria relevant to assess evidence: 0.92; uncertainty (explicit and implicit): 0.95; certainty (explicit and implicit): 0.92.” (p. 10)</p> <p><strong>Codebook: </strong>in the appendix (in German)</p> <p> </p> <p><strong>Information on Kessler, 2016</strong></p> <p><strong>Author: </strong>Sabrina Heike Kessler</p> <p><strong>Research questions: </strong>How evidently are medical issues presented in science TV programs? Are there any relationships between the individual types of evidence sources and the way they are presented? Can constant formal-abstract patterns/frames of presented evidence be identified?</p> <p><strong>Object of analysis: </strong>A full-sample content analysis of science TV programs about scientific and medical issues was conducted (<em>N</em> = 321, with <em>N</em> = 851 evidence source argumentations).</p> <p>Three frames of evidence were identified via a cluster analysis. The frames differed significantly in their degree of depicting belief, doubt, and uncertainty, which were defined as the core elements of a frame of evidence.</p>
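<p><em>Illustration:</em> A generic sketch of how frames can be derived from coded variables via cluster analysis is shown below. The dummy data, the number of variables, and the k-means procedure are assumptions for demonstration only; they do not reproduce the original analysis in Kessler (2016).</p>
<pre><code>import numpy as np
from sklearn.cluster import KMeans

# Rows = evidence source argumentations, columns = coded variables
# (e.g., depicted belief, doubt, uncertainty). Dummy data for illustration.
rng = np.random.default_rng(42)
coded_variables = rng.integers(0, 3, size=(851, 6)).astype(float)

# Partition the argumentations into three clusters ("frames of evidence")
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
frame_labels = kmeans.fit_predict(coded_variables)

# Compare the clusters on each coded variable (mean coding per frame)
for frame in range(3):
    print(frame, coded_variables[frame_labels == frame].mean(axis=0))
</code></pre>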
<p><strong>Time frame of analysis: </strong>August 1, 2011 to July 31, 2012</p> <p><strong>Info about variables</strong></p> <p><strong>Variables: </strong>variables that measure the represented uncertainty in the argumentations of evidence sources and variables that determine the formal-abstract evidence frames.</p> <p><strong>Level of analysis:</strong> science TV programs and evidence source argumentations</p> <p><strong>Variables, values and reliability: </strong></p> <p>Intercoder reliability values of the coding, separated by variable</p> <div style="overflow-x: auto;"> <table> <tbody> <tr> <td class="t"><p>Variable</p></td> <td class="t"><p>Number of Possible Values</p></td> <td class="t"><p>Number of Codings</p></td> <td class="t"><p>Holsti Reliability Coefficient</p></td> <td class="t"><p>Cohen’s Kappa</p></td> </tr> <tr> <td class="t"><p>V5 (specific topic)</p></td> <td class="t"><p>1 to x</p></td> <td class="t"><p>30</p></td> <td class="t"><p>.93</p></td> <td class="t"><p>.92</p></td> </tr> <tr> <td class="t"><p>V6 (general topic)</p></td> <td class="t"><p>1 to x</p></td> <td class="t"><p>30</p></td> <td class="t"><p>.91</p></td> <td class="t"><p>.90</p></td> </tr> <tr> <td class="t"><p>V7b (main thesis)</p></td> <td class="t"><p>x</p></td> <td class="t"><p>30</p></td> <td class="t"><p>.93</p></td> <td class="t"><p>.92</p></td> </tr> <tr> <td class="t"><p>V9 (number of evidence sources)</p></td> <td class="t"><p>1 to x</p></td> <td class="t"><p>30</p></td> <td class="t"><p>.98</p></td> <td class="t"><p>.82</p></td> </tr> <tr> <td class="t"><p>V10 (type of evidence source)</p></td> <td class="t"><p>6</p></td> <td class="t"><p>57</p></td> <td class="t"><p>.93</p></td> <td class="t"><p>.91</p></td> </tr> <tr> <td class="t"><p>V11 (validity of the evidence source)</p></td> <td class="t"><p>4</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.90</p></td> <td class="t"><p>.85</p></td> </tr> <tr> <td class="t"><p>V12 (arguments for)</p></td> <td class="t"><p>3</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.81</p></td> <td class="t"><p>.63</p></td> </tr> <tr> <td class="t"><p>V13 (arguments against)</p></td> <td class="t"><p>3</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.92</p></td> <td class="t"><p>.73</p></td> </tr> <tr> <td class="t"><p>V14 (polarity)</p></td> <td class="t"><p>3</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.99</p></td> <td class="t"><p>.92</p></td> </tr> <tr> <td class="t"><p>V15 (weighting)</p></td> <td class="t"><p>2</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.96</p></td> <td class="t"><p>.58</p></td> </tr> <tr> <td class="t"><p>V16 (actuality)</p></td> <td class="t"><p>2</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.95</p></td> <td class="t"><p>.54</p></td> </tr> <tr> <td class="t"><p>V17 (explicit uncertainty)</p></td> <td class="t"><p>3</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.86</p></td> <td class="t"><p>.39</p></td> </tr> <tr> <td class="t"><p>V18 (implicit uncertainty)</p></td> <td class="t"><p>3</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.79</p></td> <td class="t"><p>.45</p></td> </tr> <tr> <td class="t"><p>V19 (homogeneity)</p></td> <td class="t"><p>2</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.95</p></td> <td class="t"><p>.51</p></td> </tr> <tr> <td class="t"><p>V20 (detailing)</p></td> <td class="t"><p>2</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.92</p></td> <td class="t"><p>.41</p></td> </tr> <tr> <td class="t"><p>V21 (constancy)</p></td> <td class="t"><p>3</p></td> <td class="t"><p>52</p></td> <td class="t"><p>.93</p></td> <td class="t"><p>.71</p></td> </tr> <tr> <td class="t b"><p>V22 (secondary evaluation)</p></td> <td class="t b"><p>3</p></td> <td class="t b"><p>52</p></td> <td class="t b"><p>.67</p></td> <td class="t b"><p>.42</p></td> </tr> </tbody> </table> </div>
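<p><em>Illustration:</em> The two reliability coefficients reported above (Holsti and Cohen’s kappa) can be computed for the two-coder, nominal-scale case as sketched below; the coding data are hypothetical.</p>
<pre><code>from collections import Counter

def holsti(codes_a, codes_b):
    # Holsti's coefficient for two coders: CR = 2M / (N1 + N2),
    # where M is the number of agreeing coding decisions.
    agreements = sum(1 for a, b in zip(codes_a, codes_b) if a == b)
    return 2 * agreements / (len(codes_a) + len(codes_b))

def cohens_kappa(codes_a, codes_b):
    # Cohen's kappa = (p_o - p_e) / (1 - p_e): observed agreement
    # corrected for the agreement expected by chance.
    n = len(codes_a)
    p_o = sum(1 for a, b in zip(codes_a, codes_b) if a == b) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(codes_a) | set(codes_b)) / (n * n)
    if p_e == 1.0:
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of 10 units by two coders
coder_1 = [1, 1, 2, 3, 2, 1, 1, 3, 2, 2]
coder_2 = [1, 1, 2, 3, 1, 1, 1, 3, 2, 3]
print(holsti(coder_1, coder_2), cohens_kappa(coder_1, coder_2))
</code></pre>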
<p><strong>Codebook: </strong>in the appendix (in German)</p> <p> </p> <p><strong>References</strong></p> <p>Brechman, J. M., Lee, C., & Cappella, J. N. (2009). Lost in translation?: A comparison of cancer-genetics reporting in the press release and its subsequent coverage in the press. <em>Science Communication, 30</em>(4), 453-474. DOI: 10.1177/1075547009332649</p> <p>Brechman, J. M., Lee, C., & Cappella, J. N. (2011). Distorting genetic research about cancer: From bench science to press release to published news. <em>Journal of Communication, 61</em>(3), 496-513. DOI: 10.1111/j.1460-2466.2011.01550.x</p> <p>Guenther, L., Bischoff, J., Löwe, A., Marzinkowski, H., & Voigt, M. (2019). Scientific evidence and science journalism: Analysing the representation of (un)certainty in German print and online media. <em>Journalism Studies, 20</em>(1), 40-59.</p> <p>Kessler, S. H. (2016). <em>Das ist doch evident! Eine Analyse dargestellter Evidenzframes und deren Wirkung am Beispiel von TV-Wissenschaftsbeiträgen</em> (Reihe Medien + Gesundheit, Band 12). Baden-Baden: Nomos. DOI: 10.5771/9783845275468</p> <p>Retzbach, A., & Maier, M. (2015). Communicating scientific uncertainty: Media effects on public engagement with science. <em>Communication Research, 42</em>(3), 429-456. DOI: 10.1177/0093650214534967</p> <p>Ruhrmann, G., Guenther, L., Kessler, S. H., & Milde, J. (2015). Frames of scientific evidence: How journalists represent the (un)certainty of molecular medicine in science television programs. <em>Public Understanding of Science, 24</em>(6), 681-696. DOI: 10.1177/0963662513510643</p>}, number={2}, journal={DOCA - Database of Variables for Content Analysis}, author={Kessler, Sabrina H.}, year={2021}, month={Mar.} }