Rationality (Online Discussions/Discussion Quality)

Authors

DOI:

https://doi.org/10.34778/5t

Keywords:

rationality, deliberation, deliberative quality, online discussions, discussion quality, discourse quality, arguments, reasoning, evidence, justification

Abstract

Rationality is considered the most important dimension to assess the deliberative quality of online discussions. In quantitative content analyses, it is usually measured with a set of variables, including (among others) reasoning, justification, fact claims, evidence, additional knowledge, and topic relevance.

 

Field of application/Theoretical foundation

Most studies on online discussions draw on deliberative theories to measure the quality of their discourse (e.g., Esau et al., 2017; Friess et al., 2021; Rowe, 2015; Ziegele et al., 2020; Zimmermann, 2017). Deliberation is an important concept for the study of (political) online discussions (Ziegele et al., 2020). It focuses on a free and equal exchange of arguments to bridge social differences and legitimize political decisions (Dryzek et al., 2019; Fishkin, 1991; Habermas, 2015). Rationality is considered the most important dimension of deliberative quality and is inherent in most conceptualizations (Friess & Eilders, 2015). Rationality is primarily about reasoning, justifications, and facts (Engelke, 2019). Discussion participants should provide justifications and evidence to support their positions (Friess et al., 2021). These reasons and arguments must be both criticizable and verifiable or falsifiable (Esterling, 2011; Habermas, 1995). Counterarguments and different perspectives should also be included (Engelke, 2019; Ziegele et al., 2020). This allows the best arguments to be elaborated in the deliberation process and enables informed opinion formation based on these arguments (“the unforced force of the better argument”; Habermas, 2015). A rational discourse and a constructive discussion atmosphere are also considered necessary for reaching a rationally motivated consensus, a central aim of formal deliberation (Cohen, 1989; Friess & Eilders, 2015; Stromer-Galley, 2007).

 

References/Combination with other methods

Besides quantitative content analyses, the (deliberative) quality of online discussions is examined with qualitative content analyses and discourse analyses (e.g., Graham & Witschge, 2003; Price & Cappella, 2002). Furthermore, participants’ perceptions of the quality of online discussions are investigated with qualitative interviews (e.g., Engelke, 2019; Ziegele, 2016) or a combination of qualitative interviews and content analysis (Díaz Noci et al., 2012).

 

Cross-references

Rationality is one of five dimensions of deliberative quality covered in this database by the same author. Accordingly, there are overlaps with the entries on interactivity, inclusivity, explicit civility, and storytelling regarding the theoretical background, references/combinations with other methods, and some example studies.

 

Information on Heinbach & Wilms (2022)

Authors: Dominique Heinbach & Lena K. Wilms (Codebook by Dominique Heinbach, Marc Ziegele, & Lena K. Wilms)

Research question: Which attributes differentiate moderated from unmoderated comments?

Object of analysis: The quantitative content analysis was based on a stratified random sample of moderated and unmoderated comments (N = 1,682) from the German online participation platform “#meinfernsehen2021” [#myTV2021], a citizen participation platform to discuss the future of public broadcasting in Germany.

Time frame of analysis: November 24, 2020 to March 3, 2021

Info about variables

Level of analysis: User comment

Variables and reliability: see Table 1

Table 1: Variables and reliability (Heinbach & Wilms, 2022)

Dimension | Measure | Definition | Krippendorff’s α (ordinal)
Rationality | Topic relevance | Does the comment refer to the topic of the post? | .70
Rationality | Fact claims | Does the comment contain at least one objectively falsifiable statement with a claim to truth? | .78
Rationality | Reasoning | Does the comment contain at least one justification to support a statement (e.g., an assertion, opinion, or claim)? | .73
Rationality | Solution proposal | Does the comment contain at least one suggestion on how to resolve problems or issues? | .75
Rationality | Additional knowledge | Does the comment contain additional information of a knowledge nature that adds content-related value? | .72
Rationality | Genuine questions | Does the comment contain at least one question with a genuine need for information (e.g., questions of knowledge, understanding, justification, or opinion)? | .75

n = 159, 3 coders

Values: All variables were coded on a four-point scale (1 = clearly not present; 2 = rather not present; 3 = rather present; 4 = clearly present). Detailed explanations and examples for each value are provided in the Codebook (in German).

Codebook: in the appendix of this entry (in German)
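To illustrate how such reliability values can be reproduced, the following minimal sketch computes an ordinal Krippendorff’s α for one measure (e.g., reasoning) coded by three coders on the four-point scale described above. It assumes the third-party Python package krippendorff and its alpha() function; the ratings are invented for illustration only.

```python
# Minimal sketch: ordinal Krippendorff's alpha for one rationality measure
# coded by three coders on the 1-4 scale described above.
# Assumes the third-party package "krippendorff" (pip install krippendorff);
# all ratings are invented for illustration.
import numpy as np
import krippendorff

# Rows = coders, columns = coded comments; np.nan marks a missing code.
ratings = np.array([
    [1, 3, 4, 2, 4, 1, 3, 2],        # coder 1
    [1, 3, 4, 2, 3, 1, 3, 2],        # coder 2
    [2, 3, 4, 2, 4, 1, np.nan, 2],   # coder 3 did not code one comment
])

alpha = krippendorff.alpha(reliability_data=ratings,
                           level_of_measurement="ordinal")
print(f"Krippendorff's alpha (ordinal): {alpha:.2f}")
```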

 

Information on Zimmermann (2017)

Author: Tobias Zimmermann

Research question: What role do online reader comments play in a deliberative-democratic understanding of a digital public sphere? (p. 11)

Object of analysis: To compare discursive participation online and offline, the author conducted a full-sample content analysis of online reader comments (N = 1,176) and letters to the editor (N = 381) from German local newspapers on three similar conflicts in local politics concerning the renaming of streets and squares. Because the coding scheme was based on the Discourse Quality Index (DQI), only contributions that contained a demand were included in the analysis, that is, “a proposal on what decision should or should not be made” (Steenbergen et al., 2003, p. 27). Only then is a speech act considered relevant from a discourse-ethics perspective.

Time frame of analysis: June 2012 to May 2013

Info about variables

Variables: Based on the DQI (Steenbergen et al., 2003), the author operationalizes the level of justification as an indicator of rationality. This variable distinguishes four levels of justification (p. 164). Besides the ordinal variable “level of justification”, the author also uses a dichotomous measure to distinguish between substantiated and unsubstantiated claims.

Level of analysis: Individual contribution

Values: see Table 2

Table 2: Variables and values (pp. 163-166; p. 188)

Variable | Value | Definition
Level of justification | No justification | The author makes a demand without justifying it argumentatively. The demand stands for itself.
Level of justification | Indirect justification | The author introduces an argument, but its connection to the demand is incomplete, or its justification is not falsifiable.
Level of justification | Qualified justification | An argument substantiates a demand. A (falsifiable) link is made as to why one should expect that X contributes to or detracts from Y.
Level of justification | Detailed justification | At least two complete justifications are given, either for the same demand or for two different demands (broad justification), or one justification explains the represented position in depth from several points of view (deep justification).
Justification | No justification | A user makes a demand that X should (not) be done or happen without giving a justification.
Justification | Justification | A user substantiates why X should (not) be done or happen.

Reliability: Intracoder reliability was tested on a subset of 100 comments. The ordinal variable “level of justification” reached a Krippendorff’s Alpha above .73; the dichotomous variable “justification” reached a Krippendorff’s Alpha of .75 (pp. 200-201).

Codebook: pp. 159-185 (in German)
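The relationship between the two variables can be made concrete in code. The sketch below shows one plausible way to collapse the ordinal DQI level of justification into the dichotomous substantiated/unsubstantiated measure; this mapping is an illustrative assumption, not Zimmermann’s documented recoding rule.

```python
# Illustrative sketch (assumption, not Zimmermann's documented procedure):
# collapse the ordinal DQI "level of justification" into the dichotomous
# substantiated/unsubstantiated measure described in Table 2.

LEVELS = {
    0: "no justification",
    1: "indirect justification",
    2: "qualified justification",
    3: "detailed justification",
}

def is_substantiated(level: int) -> bool:
    """Treat every level above 'no justification' as a substantiated demand."""
    if level not in LEVELS:
        raise ValueError(f"unknown DQI level: {level}")
    return level > 0

# Ordinal codes for five hypothetical contributions.
coded_contributions = [0, 2, 3, 1, 0]
dichotomous = [int(is_substantiated(level)) for level in coded_contributions]
print(dichotomous)  # [0, 1, 1, 1, 0]
```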

 

Information on Ziegele et al. (2020)

Authors: Marc Ziegele, Oliver Quiring, Katharina Esau, & Dennis Friess

Research questions: RQ1: “Which news factors predict the civility and rationality of reactive user comments?” (p. 869) RQ3: “Which illustration factors predict civil and rational reactive user comments?” (p. 871)

Object of analysis: The quantitative content analysis was based on a sample of top-level comments (i.e., comments responding to the article) from the Facebook pages of nine established German news media outlets (N = 11,218). Three artificial weeks were constructed for the sampling of news articles and user comments. On each access day, three or four news articles and the corresponding user comments were randomly selected from each news page. Then, for each article, the five oldest top-level comments, the five most recent top-level comments, five random top-level comments from the middle of the discussion, and the five most popular comments were selected (20 comments per article) (pp. 872-873).
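This selection procedure can be read as a small sampling routine. The sketch below re-implements it schematically; the field names ("timestamp", "likes") and the handling of overlaps between strata are assumptions for illustration, not the authors’ actual code.

```python
# Schematic sketch of the per-article comment selection described above:
# the 5 oldest, the 5 most recent, 5 random comments from the middle of the
# discussion, and the 5 most popular top-level comments (up to 20 per article).
# Field names ("timestamp", "likes") are assumptions for illustration.
import random

def sample_comments(comments, per_group=5, seed=42):
    rng = random.Random(seed)
    by_time = sorted(comments, key=lambda c: c["timestamp"])
    oldest = by_time[:per_group]
    newest = by_time[-per_group:]
    middle_pool = by_time[per_group:-per_group] or by_time
    middle = rng.sample(middle_pool, min(per_group, len(middle_pool)))
    popular = sorted(comments, key=lambda c: c["likes"], reverse=True)[:per_group]
    # Drop duplicates when a comment falls into several strata
    # (e.g., an old comment that is also among the most popular).
    selected, seen = [], set()
    for comment in oldest + newest + middle + popular:
        if id(comment) not in seen:
            seen.add(id(comment))
            selected.append(comment)
    return selected
```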

Time frame of analysis: May 2015 to August 2015

Info about variables

Level of analysis: User comment

Variables and reliability: see Table 3

Table 3: Variables and reliability (p. 874)

Dimension | Measure | Definition | Krippendorff’s α
Rationality | Topic relevance | Is the comment on-topic? | .67
Rationality | Balance | Does the comment include a balanced view on the commented issue? | .74
Rationality | Additional knowledge | Does the comment contain additional knowledge? | .79
Rationality | Elaboration | Does the comment appear elaborate to the coders? | .81
Rationality | Arguments | Does the comment provide reasons for its claims? | .74
Rationality | Analytical | Does the comment analyze the background of the issue at hand? | .70
Rationality | Factual claims | Does the comment provide facts and factual claims? | .72
Rationality | Questions | Does the comment include genuine questions? | .80

n = 100, 9 coders

Values: “Each factor was coded on 3-point scales (0 = absent, 1 = sporadically present, 2 = highly present)” (p. 874).
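If a single comment-level rationality score is needed for further analysis, one straightforward option is to average the eight 3-point codes, as in the brief sketch below. This aggregation is an illustrative assumption and not necessarily the procedure used by Ziegele et al. (2020).

```python
# Illustrative sketch (assumption, not necessarily the authors' procedure):
# average the eight 3-point rationality codes (0-2) into one score per comment.
RATIONALITY_MEASURES = [
    "topic_relevance", "balance", "additional_knowledge", "elaboration",
    "arguments", "analytical", "factual_claims", "questions",
]

def rationality_score(coded_comment: dict) -> float:
    """Mean of the 3-point codes (0 = absent, 1 = sporadic, 2 = highly present)."""
    return sum(coded_comment[m] for m in RATIONALITY_MEASURES) / len(RATIONALITY_MEASURES)

example = {"topic_relevance": 2, "balance": 0, "additional_knowledge": 1,
           "elaboration": 2, "arguments": 2, "analytical": 1,
           "factual_claims": 1, "questions": 0}
print(round(rationality_score(example), 2))  # 1.12
```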

 

Example studies

Esau, K., Fleuß, D. & Nienhaus, S.‑M. (2021). Different Arenas, Different Deliberative Quality? Using a Systemic Framework to Evaluate Online Deliberation on Immigration Policy in Germany. Policy & Internet, 13(1), 86–112. https://doi.org/10.1002/poi3.232

Esau, K., Friess, D. & Eilders, C. (2017). Design Matters! An Empirical Analysis of Online Deliberation on Different News Platforms. Policy & Internet, 9(3), 321–342. https://doi.org/10.1002/poi3.154

Friess, D., Ziegele, M. & Heinbach, D. (2021). Collective Civic Moderation for Deliberation? Exploring the Links between Citizens’ Organized Engagement in Comment Sections and the Deliberative Quality of Online Discussions. Political Communication, 38(5), 624–646. https://doi.org/10.1080/10584609.2020.1830322

Heinbach, D. & Wilms, L. K. (2022). Der Einsatz von Moderation bei #meinfernsehen2021 [The use of moderation at #meinfernsehen2021]. In F. Gerlach, C. Eilders & K. Schmitz (Eds.), #meinfernsehen2021: Partizipationsverfahren zur Zukunft des öffentlich-rechtlichen Fernsehens. Nomos.

Monnoyer-Smith, L. & Wojcik, S. (2012). Technology and the quality of public deliberation: a comparison between on and offline participation. International Journal of Electronic Governance, 5(1), Article 47443. https://doi.org/10.1504/IJEG.2012.047443

Rowe, I. (2015). Deliberation 2.0: Comparing the Deliberative Quality of Online News User Comments Across Platforms. Journal of Broadcasting & Electronic Media, 59(4), 539–555. https://doi.org/10.1080/08838151.2015.1093482

Stromer-Galley, J. (2007). Measuring Deliberation's Content: A Coding Scheme. Journal of Public Deliberation, 3(1), Article 12.

Stroud, N. J., Scacco, J. M., Muddiman, A. & Curry, A. L. (2015). Changing Deliberative Norms on News Organizations' Facebook Sites. Journal of Computer-Mediated Communication, 20(2), 188–203. https://doi.org/10.1111/jcc4.12104

Ziegele, M., Quiring, O., Esau, K. & Friess, D. (2020). Linking News Value Theory With Online Deliberation: How News Factors and Illustration Factors in News Articles Affect the Deliberative Quality of User Discussions in SNS’ Comment Sections. Communication Research, 47(6), 860-890. https://doi.org/10.1177/0093650218797884

Zimmermann, T. (2017). Digitale Diskussionen: Über politische Partizipation mittels Online-Leserkommentaren. Edition Politik: Bd. 44. transcript Verlag. http://www.content-select.com/index.php?id=bib_view&ean=9783839438886

Further references

Cohen, J. (1989). Deliberation and democratic legitimacy. In A. P. Hamlin & P. Pettit (Eds.), The good polity: Normative analysis of the state (pp. 67–92). Blackwell.

Díaz Noci, J., Domingo, D., Masip, P., Micó, J. L. & Ruiz, C. (2012). Comments in news, democracy booster or journalistic nightmare: Assessing the quality and dynamics of citizen debates in Catalan online newspapers. #ISOJ, 2(1), 46–64. https://isoj.org/wp-content/uploads/2016/10/ISOJ_Journal_V2_N1_2012_Spring.pdf#page=46

Dryzek, J. S., Bächtiger, A., Chambers, S., Cohen, J., Druckman, J. N., Felicetti, A., Fishkin, J. S., Farrell, D. M., Fung, A., Gutmann, A., Landemore, H., Mansbridge, J., Marien, S., Neblo, M. A., Niemeyer, S., Setälä, M., Slothuus, R., Suiter, J., Thompson, D. & Warren, M. E. (2019). The crisis of democracy and the science of deliberation. Science, 363(6432), 1144–1146. https://doi.org/10.1126/science.aaw2694

Engelke, K. M. (2019). Enriching the Conversation: Audience Perspectives on the Deliberative Nature and Potential of User Comments for News Media. Digital Journalism, 8(4), 1–20. https://doi.org/10.1080/21670811.2019.1680567

Esterling, K. M. (2011). “Deliberative Disagreement” in U.S. Health Policy Committee Hearings. Legislative Studies Quarterly, 36(2), 169–198. https://doi.org/10.1111/j.1939-9162.2011.00010.x

Fishkin, J. S. (1991). Democracy and deliberation: New directions for democratic reform. Yale University Press. http://www.jstor.org/stable/10.2307/j.ctt1dt006v https://doi.org/10.2307/j.ctt1dt006v

Friess, D. & Eilders, C. (2015). A systematic review of online deliberation research. Policy & Internet, 7(3), 319–339. https://doi.org/10.1002/poi3.95

Graham, T. & Witschge, T. (2003). In Search of Online Deliberation: Towards a New Method for Examining the Quality of Online Discussions. Communications, 28(2). https://doi.org/10.1515/comm.2003.012

Habermas, J. (2015). Between facts and norms: Contributions to a discourse theory of law and democracy (Reprinted.). Polity Press.

Price, V. & Cappella, J. N. (2002). Online deliberation and its influence: The Electronic Dialogue Project in Campaign 2000. IT&Society, 1(1), 303–329. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.9.5945&rep=rep1&type=pdf

Steenbergen, M. R., Bächtiger, A., Spörndli, M. & Steiner, J. (2003). Measuring Political Deliberation: A Discourse Quality Index. Comparative European Politics, 1(1), 21–48. https://doi.org/10.1057/palgrave.cep.6110002

Ziegele, M. (2016). Nutzerkommentare als Anschlusskommunikation: Theorie und qualitative Analyse des Diskussionswerts von Online-Nachrichten [The Discussion Value of Online News. An Analysis of User Comments on News Platforms]. Springer VS.

Published

2022-11-29

How to Cite

Heinbach, D. (2022). Rationality (Online Discussions/Discussion Quality). DOCA - Database of Variables for Content Analysis, 1(5). https://doi.org/10.34778/5t

Issue

Database

User-Generated Media Content