This study explores relationships between Facebook reactions and commenting frequency through the lens of a random sample of broadcast and newspaper news posts (n = 2,632) about COVID-19 in the United States. A main finding is that reactions on COVID-19 news posts in general — and the “like” and “angry” in particular — predict greater likelihood of comments on these posts. The topic of the post also plays an important role in both reactions and commenting frequency, suggesting people use these engagement tools in differing ways. Findings suggest that people may process reactions as heuristic cues that help them assess how much attention to pay to posts on Facebook.
Facebook’s “click-based” social reactions (Freeman, et al., 2020) — such as the upward-facing thumb dubbed a “like” or the red-faced “angry” — are potent tools of online engagement. Reactions offer the public a means to communicate opinions (Kim, 2014), imbue messages with social meaning (Giuntini, et al., 2019) as a stand-in for facial cues absent in online discussions (Derks, et al., 2007), and demonstrate that a post is capturing attention (Carr, et al., 2018). At the same time, commenting on Facebook news posts offers the public an avenue to engage with the news and other online users in an interactive way (Blassnig and Wirz, 2019; Guo and Sun, 2020; Salgado and Bobba, 2019).
This study considers reactions and comments on posts about the COVID-19 pandemic to provide a topic with high interest and urgency through which to understand relationships between commenting frequency and social reactions. This study was situated in the United States because it has consistently led the world in coronavirus cases and deaths, with 88.03 million cases and 1.01 million deaths at this writing, even though it was not the original epicenter of the disease (Holshue, et al., 2020). Also, the U.S. ranked poorly compared to other countries at handling the pandemic, despite its wealth, and lacked a cohesive national health communication plan in the early days of the crisis (Noar and Austin, 2020), when data for this study were collected. The nonpartisan Lowy Institute, an independent think tank in Australia, ranked the United States 94th worst out of 98 countries it examined in terms of cases, deaths, and testing in the early pandemic. Finally, the U.S. was a focal point of the rampant pandemic misinformation that gripped the world in the early pandemic, facilitated in part by Donald J. Trump, U.S. President at the time (Evanega, et al., 2020). As a result of these factors, examining commenting and reaction behaviors related to COVID-19 is particularly meaningful in the United States.
Understanding how reactions and comments work together is important because both forms of engagement are so plentiful online as to be nearly ubiquitous parts of the news-consumption experience. The variety of reactions Facebook has offered since 2016 (Tian, et al., 2017) — including the “haha” and the “wow” — gives the public a range of emotions to communicate online and influences how the platform surfaces content in news feeds (Bell, 2017). Yet most research has focused on the “like” (Blassnig and Wirz, 2019; Guo and Sun, 2020; Heiss, et al., 2019; Kim and Yang, 2017; Salgado and Bobba, 2019; Wohn, et al., 2016) or only some reactions (Masullo and Kim, 2021), not the whole array (see Larsson, 2018, for an exception). Research has focused on the “like” because it was the original reaction on Facebook (Wisniewski, et al., 2020) and it is consistently the most frequently used (Tian, et al., 2017).
But examining other reactions is vital because Facebook has revealed that it gives more priority to content with other reactions — “love,” “haha,” “sad,” “angry,” and “hug” — over content with a “like” in assessing which content to surface in news feeds (Bell, 2017). Notably, Facebook says it weighs these other reactions equally, with all higher than the “like” (Bell, 2017). Thus, this study fills a gap in the literature by examining the role of specific reactions, as well as relationships between social reactions posted on newspaper and broadcast news posts on Facebook and commenting frequency on these posts. Using a randomly selected sample of posts from news organizations across the United States (n = 2,632), this study demonstrates that reactions in general — and the “like” and “angry” in particular — are correlated with greater likelihood of comments on these posts. The topic of the post also plays an important role in commenting frequency.
Since discussions on social-networking sites emerged in the late 1990s (boyd and Ellison, 2007; Reagle, 2015), they have grown in popularity, as social media platforms like Facebook proliferated. Discussions about news, the focus of this study, are particularly important because they offer the normative benefit of helping people feel more involved with news (Oeldorf-Hirsch and Sundar, 2015), which may lead to political or social engagement (Vaccari and Valeriani, 2018). Increasingly, these conversations about news are taking place on Facebook, as many news organizations shift their commenting to this platform to capitalize on the growing Facebook news audience (Hille and Bakker, 2014; Su, et al., 2018). The Facebook news audience is important to understand because it has the power to share news content (Tandoc and Vos, 2016), potentially helping newsrooms expand their reach. More than half of American adults (53 percent) get their news from social media at least sometimes (Shearer and Mitchell, 2021). Notably, news organizations spend considerable resources trying to woo this audience (Ju, et al., 2014). For these reasons, I focused on local broadcast and newspaper news posts on Facebook.
Context of COVID-19 pandemic
This study focuses on news content on Facebook about COVID-19 because pandemic news offers a meaningful context in which to consider whether social reactions on news posts were correlated with whether people commented on those posts. The reason is that the pandemic led to a spike in news consumption, particularly for online and TV news, and increased news trust at a time when news trust is generally lagging, particularly in the U.S. (Newman, et al., 2020). In the U.S., one in six Americans reported paying attention to COVID-19 news, and almost half (46 percent) identified their local media outlet as their source (Shearer, 2020). Indeed, news consumption, especially online and on TV, saw a “coronavirus bump” that temporarily reversed decades of decline (Meyer, 2004). This demonstrates the importance of the pandemic as a news event, making it particularly suitable for this study. Notably, local news was the focus of this study because the U.S. lacked a comprehensive national response to the pandemic (Noar and Austin, 2020) and instead relied on a patchwork of state and local efforts, which would be more likely to be highlighted in local news posts.
Furthermore, the pandemic fomented a swirl of conflicting emotions, from hope to disgust (Dubey, 2020), offering an opportunity to examine the range of Facebook’s social reactions. Notably, Facebook introduced its seventh reaction, the “hug,” amid the early pandemic (Guynn, 2020), seemingly to give people another emotional outlet during a tumultuous period.
Additionally, a context such as the COVID-19 pandemic was suitable for this study in particular because it is the type of news topic, negative and unexpected, that has been shown to elicit engagement (Salgado and Bobba, 2019; Tenenboim and Cohen, 2015).
Importantly, COVID-19 is a topic likely to generate audience engagement because of how the pandemic was politicized in the news, with political parties in the U.S. having very different perspectives on the crisis (Hart, et al., 2020). Conspiracy theories about how the virus originated, was spread, and how it should be handled (Evanega, et al., 2020; Uscinski, et al., 2020) were rampant on Facebook (Bruns, et al., 2021) and other platforms and amplified by right-wing media (Motta, et al., 2020). Republicans tended to embrace conspiracy theories about the virus perpetuated by Trump, their party leader at the time, that led them to be less likely to follow COVID-19 health guidelines, such as mask-wearing (Latkin, et al., 2021a) or to refuse the COVID-19 vaccine (Latkin, et al., 2021b).
This created an environment, at least in the United States, that might fuel visceral responses to news posts about the virus and generate polarizing and conspiratorial comments. This context is important to consider even though the study focused on news posts from local news outlets — that is, regional or local affiliates of network TV news stations or community or city newspapers — not general-interest posts about COVID-19. Thus, while the topic of COVID-19 certainly generated misinformation and conspiracy theories online, these posts were focused on news.
Social reactions as heuristic cues
Social reactions are sometimes conceptualized as paralinguistic digital affordances (PDAs; Hayes, et al., 2016; Spottswood and Wohn, 2019; Wohn, et al., 2016), and they are an important area for understanding how people use social media. They were created for the public to interact with social media content, but they have evolved into potent social tools that convey attention (Carr, et al., 2018) and emotion (Giuntini, et al., 2019). Social reactions also generate data for social media companies, drive traffic to posts (Gerlitz and Helmond, 2013), and enable people to manage their online identities (Ozanne, et al., 2017). The “like” has been on Facebook since 2009 (Wisniewski, et al., 2020) and remains the most popular reaction by a wide margin (Tian, et al., 2017). Facebook expanded reactions to the “love,” “haha,” “wow,” “sad,” and “angry” in 2016 (Tian, et al., 2017; Wisniewski, et al., 2020), and specific reactions may express complex emotional experiences with varied valence.
Research suggests different reactions have distinct meaning to people. For example, people use the “like” to signal they have noticed a post without expressing a specific emotion (Hayes, et al., 2016; Spottswood and Wohn, 2019) and to convey social support (Wohn, et al., 2016) or a positive response (Porten-Cheé, et al., 2018). The “haha” can convey surprise or joy, and the “wow” generally shows surprise, although this can be negative, positive, or neutral (Giuntini, et al., 2019). The “angry” and the “sad” consistently demonstrate negative emotion (Giuntini, et al., 2019). Yet, much research has focused on the “like,” so much less is known about the other reactions.
People perceive reactions as primarily expressive tools that may encourage online discussions (Kim, 2014). Indeed, Facebook reactions and comments on posts have been found to be meaningful ways to understand people’s emotional responses to online content (Tian, et al., 2017), and the public appreciates having so many options for reactions (Wisniewski, et al., 2020). Reactions are also a way for the public to assess whether a post has been successful in reaching an audience (Carr, et al., 2018) or to signal controversial news posts (Basile, et al., 2017). The “like” in particular may serve as a popularity cue (Porten-Cheé, et al., 2018) to draw attention to a post. Indeed, “likes” have been found to be related to positive emotions in posts (Heiss, et al., 2019), although other research finds them to be more neutral markers that people have noticed a post (Hayes, et al., 2016; Spottswood and Wohn, 2019). Reactions imbue emotions in similar ways to emoji (Fleuriet, et al., 2014), but differ because emoji are more frequently used in dyadic communication from one user to another. In contrast, reactions offer an aggregated sense of emotion about a post or article to other users (Wisniewski, et al., 2020). Reactions (Masullo and Kim, 2021), like emoji (Fleuriet, et al., 2014), may change how people perceive messages.
The heuristic-systematic model of information processing (HSM; Chaiken, 1980) offers a useful way to understand reactions on social media and their potential relationship with commenting. HSM proposes that when people are confronted with messages, they make split-second decisions on how to process them. If the content is important to them, they process it systematically, expending effort to assess whether the message is valid and has a clear argument (Chaiken, 1980). But when the content is less important, they process it heuristically, relying on cues from their past knowledge — called “cognitive heuristics” — to assess the content (Chaiken, 1980). These cues trigger a train of thought that helps people make sense of the content quickly (Sundar, 2008; Chaiken, 1980).
In the context of news, heuristic processing is particularly likely because people are confronted with so much content that they need to quickly sort through it to decide what deserves their attention (Chen and Chen, 2020; Holton and Chyi, 2011). Scholars have conceptualized heuristics in various ways in relation to media content. For example, scholars have conceptualized all the following as heuristics: uncivil comments posted on news stories (Prochazka, et al., 2018), how news is shared (Sundar, 2008), media brands (Urban and Schweiger, 2014), video quality (Chen, et al., 2017), labels of “opinion” or “news” on stories (Peacock, et al., 2022), and boxes beside a story that explain how it was reported (Masullo, et al., 2021).
Heuristics allow people to preserve their limited cognitive energy by relying on mental shortcuts and past knowledge, rather than in-depth thought, to make assessments (Metzger and Flanagin, 2013). The way this works is that the cue conveys to the audience something about the content, such as that it is credible (Chen, et al., 2017; Masullo, et al., 2021; Sundar, 2008) or has high journalistic quality (Prochazka, et al., 2018), based on people’s prior understanding of that cue. Then people make assessments about the content based on that prior understanding.
In this study, I translate this idea to Facebook reactions, positing that whether a news story has reactions and the type of reaction (e.g., “like,” “angry,” etc.) may both operate as cognitive heuristics conveying to the audience something about the content that is related to whether people comment on a post or not. Thus, I argue that the reaction serves as a heuristic cue of whether the story is worth commenting upon. Support for this argument comes from research that shows reactions may operate as heuristics. The “angry,” for example, may operate as a heuristic that cancels out the effects of incivility in online corrections of misinformation (Masullo and Kim, 2021). Similarly, the “like” may be a cue signaling that a post or comment has value (Kluck, et al., 2019; Naab, et al., 2020; Porten-Cheé, et al., 2018; Waddell, 2020). Social reactions are likely to trigger either bandwagon (Sundar, et al., 2007) or endorsement (Metzger and Flanagin, 2013) heuristics. Both heuristics depend on the idea that the reactions tell people what others think about the post, encouraging them to mirror that thinking. So a “like” on a post cues people to be more inclined to like it themselves (Sundar, et al., 2007). Of course, COVID-19 is a unique topic in many ways because people may already have strong pre-existing attitudes regarding it that shape their behaviors (Latkin, et al., 2021a). However, this does not undermine the theoretical expectation that they would process reactions on COVID-19 news posts heuristically. If anything, having strong pre-existing attitudes would increase the likelihood that people would respond to posts heuristically, without deep systematic thought processes.
To understand whether reactions may operate as heuristics that trigger people to comment on news stories about COVID-19, I first considered the frequency of the different types of reactions on news posts that had comments. Research has shown great variability in how frequently each reaction gets used (Blassnig and Wirz, 2019), supporting the premise of the following research question.
RQ1: How frequent are reactions (“like,” “love,” “hug,” “haha,” “wow,” “sad,” and “angry”) on Facebook local news posts about COVID-19 that have comments?
Then, based on the literature on heuristic processing discussed above, I predicted that reactions on stories would operate as heuristics that make it more likely that people would also comment on a story. The over-arching rationale for this prediction is that reactions can be emotional responses (Giuntini, et al., 2019; Tian, et al., 2017). Commenting has been found to have more cognitive roots (Kim and Yang, 2017) that operate differently than reactions (Salgado and Bobba, 2019). Thus, it is likely that emotions embedded in reactions might be related to the cognitive response of commenting (Marcus, et al., 2000), and reactions may encourage online discussions (Kim, 2014; Larsson, 2018). Furthermore, research in various contexts has shown that reactions and comments may differ in frequency on Facebook posts (Blassnig, et al., 2019; Bobba, 2019). This supports a hypothesis predicting a positive relationship between reactions and commenting. Notably, I am predicting a correlation between reactions and comments without any indication of causal order.
H1: There will be a positive relationship between reactions on Facebook local news posts about COVID-19 and commenting on those posts.
Next, I delved into how specific reactions may operate differently, based on the emotional responses they generate. The justification for this inquiry is research showing that different reactions link to different emotional responses. The “like,” for example, can convey a positive (Porten-Cheé, et al., 2018) or neutral response (Hayes, et al., 2016; Spottswood and Wohn, 2019). The “angry” and “sad” are linked to negative emotions, and the “wow” conveys surprise (Giuntini, et al., 2019). As a result, it is possible that specific reactions would be more or less likely to appear on stories with frequent comments. The research on whether reactions are related to commenting on Facebook posts is mixed. Some research shows positive relationships between the “like” and the “angry” and commenting, but other reactions, such as the “wow,” showed negative relationships (Larsson, 2018). Other studies found no relationship between reactions and commenting (Blassnig and Wirz, 2019). Thus, given the conflicting evidence, a research question was posed:
RQ2: Are specific reactions (“like,” “love,” “hug,” “haha,” “wow,” “sad,” and “angry”) associated with likelihood that Facebook local news posts about COVID-19 have comments?
Research is more consistent that the topic of a post may be related to engagement behaviors, such as commenting and reactions (Blassnig, et al., 2019; Blassnig and Wirz, 2019; Guo and Sun, 2020; Tian, et al., 2017; Salgado and Bobba, 2019). Yet, the role that topics play has not been fully unpacked because only a relatively few topics have been studied. Facebook posts with populist content (Blassnig, et al., 2019; Blassnig and Wirz, 2019), about unexpected and national-level news events (Salgado and Bobba, 2019), and political news (Guo and Sun, 2020) are more likely to receive comments and reactions. Political and crime news also elicit more comments (Tenenboim and Cohen, 2015), but negatively valenced support-seeking messages were found to receive fewer reactions and comments, compared to mixed-valence messages (Li, et al., 2020). This research offers some evidence that controversial or negative news may generate more engagement, such as comments and reactions, but it also suggests the opposite may be the case. Further, research suggests that people employ reactions and comments in differing ways (Blassnig, et al., 2019; Bobba, 2019), offering support for a research question considering what role the topic of a post plays in whether a post receives comments or reactions.
However, local news posts about COVID-19 are dissimilar from these earlier studies in substantial ways. First, the pandemic is an over-arching news event with a worldwide scope that would logically have greater impact on people than a typical political or crime story, and pandemic news is certainly markedly different than support-seeking messages described above (Li, et al., 2020). Second, some aspects of COVID-19 news might be perceived as controversial (e.g., information about vaccines or masks) because of debates about them (Largent, et al., 2020; Latkin, et al., 2021a, 2021b). Others, such as canceled events, may be more benign. Thus, while research is instructive in suggesting that topic would likely play a role in whether news posts received comments or reactions, it does not provide a clear direction on how that would transpire.
RQ3: Do the topics about COVID-19 that elicited the most comments differ from the topics that elicited the most reactions?
To answer the research questions and test the hypothesis, local news posts on Facebook were collected across the United States at three points in the early pandemic to provide a cross-section of discussions about the crisis. These points were 23 March 2020, soon after the pandemic was declared (Cucinotta and Vanelli, 2020); 23 April 2020, a month later; and 23 June 2020, just as cases were climbing again. Using CrowdTangle, a Facebook-owned public insights tool (CrowdTangle, 2020), posts were collected from a random selection of the regional or local affiliates of the four major television news stations in the U.S. (ABC, CBS, NBC, and FOX) and from local newspapers in each of the 50 state capitals. This generated a total sample of 3,554 posts from newspapers and 22,029 posts from TV stations. Data for these three points were collapsed into one sample because the different time periods were not relevant to this study’s research questions or hypothesis.
Because the study focuses on local news about COVID-19, the first step was to determine which of the posts in the overall sample were local news about the coronavirus. Posts were judged to be about coronavirus (1 = yes, 0 = no) if they noted the pandemic, the virus, social distancing, the quarantine, shutdown, or related terms. Then coronavirus posts were further coded for whether they were about local news (1 = yes, 0 = no), defined as stories that dealt with local or state issues where the newspaper was published or the affiliate station was located; stories about national news were not considered local. Before this coding began, intercoder reliability was assessed using a subset of the sample, and for both variables the Krippendorff’s alpha coefficients indicated acceptable reliability, ranging from 0.81 to 0.94 (Table 1).
Table 1: Krippendorff’s alpha coefficients for intercoder reliability for whether posts were about coronavirus and local news.

Data collection | Sample size: newspaper posts | Sample size: TV posts | COVID-19 | Local
23 March | 250 | 250 | 0.93 | 0.86
23 April | 130 | 160 | 0.93 | 0.86
23 June | 50 | 50 | 0.94 | 0.81
This process yielded a final sample of 2,632 local news posts about COVID-19. These posts were subsequently coded into 15 topics that reflect a cross-section of news about the virus. Before beginning this coding, intercoder reliability was assessed on 528 posts (roughly half from newspapers and half from TV stations), and Krippendorff’s alpha coefficients for almost all the variables were acceptable (Riffe, et al., 2019), ranging from 0.76 to 0.95 (Table 2). The one exception was fact checking. Intercoder reliability could not be calculated for that variable because no fact-checking posts were in the reliability subsample, but the coders did agree 100 percent of the time that this topic was absent.
Table 2: Coding categories, description, and Krippendorff’s alpha for intercoder reliability.
Note: + Intercoder reliability could not be calculated for this variable because no fact checks were included in the subsample, but two coders had 100 percent agreement that the category was absent.
Topic | Code “yes” if it includes the following | Pooled reliability
Positive cases | Focus on positive tests or number of infections | 0.84
Deaths | Total deaths or specific death of a person | 0.95
Affected groups | Effects of virus on subpopulations (e.g., homeless, senior citizens, etc.) | 0.82
Testing information | How to get tested, who can get tested, test availability | 0.76
Hospitals | How hospitals are preparing or managing cases | 0.83
Schools | Effects on elementary, middle, and high schools, colleges and universities | 0.93
Restaurants | Effects on restaurants and bars, including curbside pickup | 0.93
Canceled events | Reporting of cancelations related to the virus | 0.89
Related crime | Crime related to the virus (e.g., stolen toilet paper) | 0.81
Grocery stores | Response of local grocery stores, including grocery sections of local Target, Costco, etc. | 0.94
How to help | What residents can do to help their communities | 0.91
Gov’t response | Local and state government response, including restrictions | 0.82
Fact checks+ | Stories that fact-check misinformation | —
Businesses | Response of gyms, child care centers, gun shops, and other businesses (except grocery stores) | 0.80
Economic effects | Projections about local economy, unemployment, or profits | 0.79
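For readers unfamiliar with the reliability statistic reported in Tables 1 and 2, Krippendorff’s alpha can be computed directly from its definition as one minus the ratio of observed to expected disagreement. The sketch below is a minimal pure-Python version for nominal data with hypothetical two-coder codings; it is not the software the coders actually used:

```python
from collections import Counter

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    units: list of lists; each inner list holds the values the coders
    assigned to one unit (only units coded at least twice are pairable).
    """
    pairable = [u for u in units if len(u) >= 2]
    n = sum(len(u) for u in pairable)                  # total pairable values
    totals = Counter(v for u in pairable for v in u)   # overall value frequencies

    # Observed disagreement: mismatched pairs within each unit.
    d_o = sum(
        c * (len(u) - c) / (len(u) - 1)
        for u in pairable
        for c in Counter(u).values()
    ) / n

    # Expected disagreement: mismatched pairs expected by chance.
    d_e = sum(nc * (n - nc) for nc in totals.values()) / (n * (n - 1))

    return 1.0 - d_o / d_e

# Hypothetical two-coder binary codings (1 = about COVID-19, 0 = not)
codings = [[1, 1], [1, 1], [0, 0], [1, 0]]
print(round(krippendorff_alpha_nominal(codings), 3))  # → 0.533
```

A value of 1 indicates perfect agreement; values around 0.80 or above are conventionally treated as acceptable (Riffe, et al., 2019).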
Reactions, comments, and topics
CrowdTangle (2020) provides Facebook post data with the frequency of comments and reactions (“like,” “love,” “hug,” “haha,” “wow,” “sad,” and “angry”) that each post received. These became key variables for this study. The 15 pandemic topics were considered control variables in analyses for H1 and RQ2 because different news topics are related to whether people will post reactions or comment on a post (Guo and Sun, 2020; Salgado and Bobba, 2019). However, topics were focal independent variables in RQ3.
RQ1 asked how frequently the different types of Facebook reactions (“like,” “love,” “hug,” “haha,” “wow,” “sad,” and “angry”) were posted on Facebook local news posts about COVID-19 that have comments on them. To answer this, the sample was first filtered to include only posts with comments, resulting in n = 2,227. Given the highly skewed distributions of the reaction data, these variables were logarithmic 10 transformed (Tabachnick and Fidell, 2007). A related-samples Friedman’s two-way analysis of variance by ranks was conducted using the logged variables and a Bonferroni correction to control for repeated tests, and the overall analysis was significant, χ² = 6,502.92, p < .001. This test ranks the variables within each case and tests for significant differences among the mean ranks.
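To illustrate the ranking logic of this test, the following sketch computes the Friedman chi-square statistic in pure Python (average ranks for ties, no tie correction), using a small hypothetical posts-by-reactions matrix rather than the study’s actual SPSS output:

```python
def friedman_statistic(rows):
    """Friedman chi-square statistic for k related samples.

    rows: list of cases, each a list of k measurements (e.g., one
    post's log10-transformed counts for the k reaction types).
    Values are ranked within each case, using average ranks for
    ties; no tie correction is applied, for brevity.
    """
    n, k = len(rows), len(rows[0])
    rank_sums = [0.0] * k
    for row in rows:
        for j, v in enumerate(row):
            below = sum(1 for x in row if x < v)
            tied = sum(1 for x in row if x == v)
            rank_sums[j] += below + (tied + 1) / 2
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)

# Hypothetical posts-by-reactions matrix: columns = like, sad, angry counts
posts = [[120, 14, 3], [45, 9, 11], [300, 22, 5], [80, 30, 2]]
print(round(friedman_statistic(posts), 2))  # → 6.5
```

A large statistic (compared against a chi-square distribution with k − 1 degrees of freedom) indicates the reaction types differ in their typical ranks across posts.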
As shown in Figure 1, “likes,” the most long-standing reaction, were ranked significantly more frequently (6.72) than any of the other reactions. “Hug” reactions (2.08), introduced in April 2020 (Guynn, 2020) in the midst of data collection for this study, ranked significantly less frequently than any other reaction. “Sad” reactions (4.11) were significantly less frequent than “likes,” significantly more frequent than “angry” (3.65), “haha” (3.49), and “hug,” and not different from “wow” (4.03) or “love” (3.93). It is informative that “sad” reactions, one of the more negative of the seven options (Giuntini, et al., 2019), were frequent during the pandemic, a particularly troublesome period in society.
Figure 1: Mean ranks of logarithmic 10 transformed reactions on posts with comments based on related samples Friedman’s two-way analysis of variance test for non-parametric data. All rankings are significantly different except between “haha” and “angry,” “love” and “wow,” “love” and “sad,” and “wow” and “sad.”
H1 predicted a positive relationship between reactions on Facebook local news posts about COVID-19 and commenting frequency. A total reaction score was created by summing all reactions, and this composite score was logarithmic 10 transformed because it was highly skewed (Tabachnick and Fidell, 2007). Because commenting frequency, the dependent variable, is a count with an over-dispersed distribution, negative binomial regression was used to test H1 (Guo and Sun, 2020; Heiss, et al., 2019). The 15 topics were entered as controls. An initial omnibus test showed that reactions [χ² = 1,229.90, p < .001] significantly improved model fit over a null model. Confirming H1, reactions showed a significant positive relationship with commenting frequency [β = 2.56, incidence rate ratio (IRR) = 12.99, p < .001]. As shown in Table 3, posts about government response to the pandemic were more likely to receive comments, regardless of whether they had reactions or not [β = 0.46, IRR = 1.58, p < .001], while posts about coronavirus-related crime, businesses, schools, restaurants, and canceled events were less likely to receive comments, regardless of whether they had reactions or not.
Table 3: Coefficients for negative binomial regression model examining relationships between reactions and news topics on commenting frequency.
Note: IRR = incidence rate ratios; + Variable is logarithmic 10 transformed because of skew.
Variable | β | SE | IRR | P value
Total reactions+ | 2.56 | 0.07 | 12.99 | <.001
Topics:
Gov’t response | 0.46 | 0.09 | 1.58 | <.001
Business | -0.51 | 0.12 | 0.60 | <.001
Positive cases | -0.08 | 0.11 | 0.92 | .46
Economic effects | 0.15 | 0.13 | 1.12 | .25
Schools | -0.67 | 0.14 | 0.51 | <.001
Affected groups | -0.11 | 0.14 | 0.89 | .41
Deaths | 0.01 | 0.16 | 1.01 | .98
Testing information | -0.03 | 0.17 | 0.97 | .87
Hospitals | -0.22 | 0.16 | 0.80 | .16
Restaurants | -0.62 | 0.25 | 0.54 | .01
Canceled events | -1.07 | 0.27 | 0.34 | <.001
Grocery stores | 0.00 | 0.24 | 1.00 | .99
How to help | -0.40 | 0.26 | 0.67 | .11
Related crime | -0.78 | 0.26 | 0.46 | .003
Fact checks | -0.95 | 0.55 | 0.39 | .08
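Because negative binomial regression uses a log link, each IRR in Table 3 is simply the exponentiated coefficient: an IRR above 1 multiplies the expected comment count, and an IRR below 1 shrinks it. A quick check against the table (small discrepancies, such as 12.94 versus the reported 12.99, presumably reflect coefficients rounded to two decimals):

```python
import math

def irr(beta):
    """Incidence rate ratio implied by a log-link (negative binomial)
    coefficient: a one-unit increase in the predictor multiplies the
    expected comment count by exp(beta)."""
    return math.exp(beta)

# Coefficients reported in Table 3
print(round(irr(2.56), 2))   # total reactions (log10): 12.94, vs. the reported 12.99
print(round(irr(0.46), 2))   # gov't response: 1.58, i.e., ~58% more comments
print(round(irr(-1.07), 2))  # canceled events: 0.34, i.e., ~66% fewer comments
```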
RQ2 asked whether specific reactions (“like,” “love,” “hug,” “haha,” “wow,” “sad,” and “angry”) were associated with commenting frequency. Commenting frequency was regressed on the seven reactions, again using negative binomial regression, and the 15 topics were entered as controls. The reactions were logarithmic 10 transformed because of high skew (Tabachnick and Fidell, 2007). The “hug” reaction was so infrequent in the data that the SPSS statistical software dropped it from the model. An initial omnibus test showed that the reactions collectively [χ² = 1,367.28, p < .001] significantly improved model fit over a null model. Answering RQ2, results showed that the “like,” “angry,” “sad,” “haha,” and “love” all had positive relationships with commenting frequency, with the “like” [β = 1.72, IRR = 5.58, p < .001] and the “angry” [β = 0.85, IRR = 2.33, p < .001] showing the largest effects (Table 4). Posts about government response to the pandemic were more likely to receive comments regardless of whether they had any reactions or not [β = 0.23, IRR = 1.27, p = .01]. Posts about coronavirus-related crime, businesses, restaurants, schools, and canceled events were less likely to receive comments, regardless of reactions.
Table 4: Coefficients for negative binomial regression model examining relationships between reactions and news topics on comment frequency.
Note: IRR = incidence rate ratios; + Variables are logarithmic 10 transformed because of skew. The SPSS statistical software dropped the “hug” reaction from the model because it appeared too infrequently in the dataset.
Variable | β | SE | IRR | P value
Reactions:
Like+ | 1.72 | 0.12 | 5.58 | <.001
Angry+ | 0.85 | 0.09 | 2.33 | <.001
Sad+ | 0.71 | 0.08 | 2.04 | <.001
Haha+ | 0.56 | 0.13 | 1.75 | <.001
Love+ | 0.24 | 0.09 | 1.23 | .01
Wow+ | -0.19 | 0.11 | 0.82 | .07
Topics:
Gov’t response | 0.23 | 0.09 | 1.27 | .01
Business | -0.41 | 0.11 | 0.67 | <.001
Positive cases | 0.15 | 0.12 | 1.16 | .84
Economic effects | 0.20 | 0.12 | 1.22 | .11
Schools | -0.54 | 0.13 | 0.58 | <.001
Affected groups | -0.13 | 0.12 | 0.87 | .28
Deaths | 0.03 | 0.16 | 1.03 | .84
Testing information | 0.12 | 0.16 | 1.13 | .46
Hospitals | -0.15 | 0.15 | 0.86 | .32
Restaurants | -0.47 | 0.23 | 0.62 | .04
Canceled events | -1.08 | 0.25 | 0.34 | <.001
Grocery stores | 0.04 | 0.22 | 1.04 | .88
How to help | -0.34 | 0.23 | 0.71 | .15
Related crime | -0.87 | 0.26 | 0.42 | .001
Fact checks | -0.63 | 0.51 | 0.53 | .22
RQ3 asked whether the topics about COVID-19 that elicited the most comments differed from the topics that elicited the most reactions. To answer this, the data were filtered to the posts within each topic. Then, because of the skewed nature of the data, median scores for comments per post and reactions per post were calculated for each topic. This allowed comparisons to be made even though posts about some topics were more frequent in the overall sample than others.
As shown in Figure 2, posts about grocery stores, restaurants, and affected groups (e.g., senior citizens or homeless populations) elicited the most comments, while posts that were fact checks, ways to help, and canceled events received the fewest comments. In comparison, posts about fact checks, government response, positive cases, and deaths garnered the most reactions while posts about ways to help, canceled events, and schools received the fewest reactions. Thus, these findings suggest that how the public uses comments and reactions differs in some substantial ways.
Figure 2: Medians for comments and reactions per post for each of 15 news topics.
Facebook’s array of seven reactions and the option to comment offer potent forms of engagement for people to connect with the news. This study sought to advance understanding of how reactions and comments are related to each other in the context of a particularly newsworthy period, the COVID-19 pandemic, in the U.S., a country that leads the world in deaths and illnesses from the virus. An informative finding is that while “likes,” the most long-standing social reaction, were unsurprisingly plentiful in the data, the “sad” reaction, one of the most negative reactions (Giuntini, et al., 2019), was the next most frequent. This suggests that other reactions are at least beginning to approach the prominence of the “like” and that there is great variability in how people use reactions (Blassnig and Wirz, 2019).
It also confirms that people may be using these reactions to express complex emotional experiences (Wisniewski, et al., 2020) guided by actual emotions, suggesting that specific reactions serve as heuristics of specific pre-existing emotional experiences. Using the “sad” reaction, for example, may serve as a heuristic of a pre-existing understanding of feeling sad, the very emotion one might expect during the depressing period of the pandemic. Similarly, the “angry” reaction may be a heuristic of people’s pre-existing feeling of being upset or angry. Theoretically, this provides new knowledge that heuristics can operate at the emotional, not just cognitive, level. This is important because in fast-paced discussions online, where people may not take time to contemplate, the power of reactions as emotional heuristics may play an outsized role in cuing other users regarding how to perceive a post on Facebook or even color their response to that post. This is particularly troubling regarding a topic like COVID-19 that has fueled rampant misinformation and conspiracy theories (Bruns, et al., 2021; Evanega, et al., 2020; Uscinski, et al., 2020) that would likely heighten people’s emotional responses to any posts on this topic.
Another notable finding is that the more reactions there were on news posts about the pandemic, the more comments these posts were likely to have, suggesting they operate as endorsement (Metzger and Flanagin, 2013) heuristics. This offers several theoretical contributions. First, it supports earlier research showing that reactions on posts may encourage online conversations (Kim, 2014; Larsson, 2018), although this study cannot parse the causal order of whether comments or reactions came first. Second, it suggests that reactions and comments are forms of engagement with distinct yet related motivations (Kim and Yang, 2017; Salgado and Bobba, 2019) that may be used in differing ways (Blassnig, et al., 2019; Bobba, 2019). Third, these findings show that reactions are linked to the more cognitive engagement experience of commenting (Kim and Yang, 2017) and may be related to greater frequency of commenting (Larsson, 2018). This finding offers new theoretical knowledge about heuristics by supporting the notion that reactions may operate as these cues (Masullo and Kim, 2021) and that people may process reactions without systematic thought. Further, the finding that “like” and “angry” reactions were most strongly related to commenting frequency is informative. It underscores which aggregated emotional responses are most linked to the more engaging (Guo and Sun, 2020) commenting behavior. It also offers broader implications about how news that is divisive or troubling enough to warrant an “angry” may drive interactions online.
In addition, findings regarding the topics of pandemic news are instructive. Posts about government response to the pandemic were most likely to receive comments, regardless of whether they received reactions, while other topics made comments less likely or had no significant effects. This suggests that people may have been more interested in news about government shutdowns or mask mandates and paid less attention to other topics. It is also notable that even though reactions overall and specific reactions had positive relationships with commenting frequency, the topic of government response produced a strong positive effect. This highlights the necessity of considering news topics in understanding relationships between reactions and commenting because these engagement features may operate differently with varying topics (Blassnig, et al., 2019; Blassnig and Wirz, 2019; Guo and Sun, 2020; Salgado and Bobba, 2019). It also suggests that while reactions may be heuristics to the public that a post is worthwhile or valuable, the topic of the article may also be such a cue. Scholars should be sure to consider post topic in studies examining reactions and commenting.
Finally, the different patterns established for reactions and comments are illuminating. Overall, commenting was more frequent than reacting, although some topics received more comments (e.g., those about grocery stores, restaurants, and affected groups, such as senior citizens), while other topics received more reactions (e.g., posts about fact checks, government response, positive cases, and deaths). These findings suggest that the way people use reactions is decidedly different from what makes them want to comment (Kim and Yang, 2017; Salgado and Bobba, 2019), but more research is needed on more topics to provide a fuller picture of how this works. Overall, these findings help us understand and interpret the role of both comments and reactions, showing that they are part of the architecture of online discussions, not merely tools to generate data for social media platforms or drive traffic (Gerlitz and Helmond, 2013).
However, some limits on the generalizability of this study should be noted. While news posts on COVID-19 were considered in this study because of the high interest and audience engagement they would likely generate, COVID-19 is clearly unlike other topics. Thus, the findings regarding reactions on COVID-19 news posts acting as heuristic cues of specific emotional experiences should be treated with some caution, as these reactions may operate differently in a less-explosive news context that is not as polarizing (Hart, et al., 2020) or rife with conspiracy theories and misinformation (Evanega, et al., 2020; Uscinski, et al., 2020). Yet, the relevance of these findings remains because the COVID-19 pandemic was and continues to be a transformative crisis worldwide. So even if these findings cannot generalize to other topics, understanding how reactions and comments operate in such a pivotal news topic provides important knowledge in its own right. Furthermore, the study was conducted using Facebook news posts, so the full array of Facebook reactions could be considered. But relationships between reactions and commenting may differ on other platforms, such as Twitter, that have fewer reaction options.
Overall, this research makes four main contributions. First, Facebook’s reactions may serve as heuristics, encouraging comments on news posts, although the topic of the post matters for whether people comment, and this study could not establish causal order. Second, reactions are not monolithic. Rather, they seem to serve as specific heuristics of pre-existing emotional experiences (Wisniewski, et al., 2020) and provide a means to communicate complex experiences. Third, the array of seven reactions that Facebook uses may play different roles. The “like” is certainly predominant, but the “sad” is popular for conveying negative emotion, and the “like” and the “angry” are most related to commenting. Fourth, commenting and reactions may play different roles in user engagement, with some posts encouraging one over the other.
These findings point to several areas for fruitful research. The “hug” reaction, which was introduced in the midst of data collection for this study (Guynn, 2020), deserves greater study. It was too infrequent in these data to be considered, but its role later in the pandemic or in other contexts could be informative. In addition, survey and interview research could help augment content analysis studies such as this one to increase our understanding of reactions and to uncover why people picked one reaction over another or why they chose to comment on one post versus another (e.g., Hayes, et al., 2016).
About the author
Gina M. Masullo (Ph.D., Syracuse University) is Associate Director of the Center for Media Engagement and an Associate Professor in the School of Journalism and Media, both at the University of Texas at Austin. Her research focuses on how the digital space both connects and divides people and how that influences society, individuals, and journalism. She is the author of Online incivility and public debate: Nasty talk (Palgrave Macmillan, 2017) and The new town hall: Why we engage personally with politicians (Praeger, 2020) and co-editor of Scandal in a digital age (Palgrave Macmillan, 2016). She spent 20 years as a newspaper journalist before becoming a professor. Her research has been published in the Journal of Broadcasting & Electronic Media, New Media & Society, Journalism, and Computers in Human Behavior, among other journals.
E-mail: Gina [dot] masullo [at] austin [dot] utexas [dot] edu
This is a project of the Center for Media Engagement that was made possible thanks to funding from the Democracy Fund, William and Flora Hewlett Foundation, and John S. and James L. Knight Foundation. Special thanks go to Natalie (Talia) J. Stroud, Gabrielle Chavez, Jessica Collier, Katalina Deaven, Natalie Deller, Jacob Gursky, Jay Jennings, Katie Joseff, Ashley Muddiman, Caroline Murray, Ellery Wadman-Goetsch, and Tamar Wilner for providing feedback and compiling and/or coding content for this.
1. As of 15 July 2022, 557.92 million people had contracted the disease worldwide, and 6.36 million had died, according to the World Health Organization: https://covid19.who.int/. The U.S. had 88.03 million cases and 1.01 million deaths from COVID-19 as of 15 July 2022, according to the WHO.
2. The Lowy Institute ranked the countries during the 36 days following the tenth confirmed case in that country. See more details: https://interactives.lowyinstitute.org/features/covid-performance/#rankings.
3. Newman, et al., 2020, p. 10.
Angelo Basile, Tommaso Caselli, and Malvina Nissim, 2017. “Predicting controversial news using Facebook reactions,” Proceedings of the Fourth Italian Conference on Computational Linguistics CLiC-it 2017, pp. 12–17.
doi: https://doi.org/10.4000/books.aaccademia.2370, accessed 1 August 2022.
Karissa Bell, 2017. “You might want to rethink what you’re ‘liking’ on Facebook now,” Mashable (27 February), https://mashable.com/article/facebook-reactions-news-feed, accessed 1 August 2022.
Sina Blassnig and Dominique S. Wirz, 2019. “Populist and popular: An experiment on the drivers of user reactions to populist posts on Facebook,” Social Media + Society (25 November).
doi: https://doi.org/10.1177/2056305119890062, accessed 1 August 2022.
Sina Blassnig, Sven Engesser, Nicole Ernst, and Frank Esser, 2019. “Hitting a nerve: Populist news articles lead to more frequent and more populist reader comments,” Political Communication, volume 36, number 4, pp. 629–651.
doi: https://doi.org/10.1080/10584609.2019.1637980, accessed 1 August 2022.
Guiliano Bobba, 2019. “Social media populism: Features and ‘likeability’ of Lega Nord communication on Facebook,” European Political Science, volume 18, number 1, pp. 11–23.
doi: https://doi.org/10.1057/s41304-017-0141-8, accessed 1 August 2022.
danah boyd and Nicole B. Ellison, 2007. “Social network sites: Definition, history, and scholarship,” Journal of Computer-Mediated Communication, volume 13, number 1, pp. 210–230.
doi: https://doi.org/10.1111/j.1083-6101.2007.00393.x, accessed 1 August 2022.
Axel Bruns, Edward Hurcombe, and Stephen Harrington, 2021. “Covering conspiracy: Approaches to reporting the COVID/5G conspiracy theory,” Digital Journalism (15 September).
doi: https://doi.org/10.1080/21670811.2021.1968921, accessed 1 August 2022.
Caleb T. Carr, Rebecca A. Hayes, and Erin M. Sumner, 2018. “Predicting a threshold of perceived Facebook post success via likes and reactions: A test of explanatory mechanisms,” Communication Research Reports, volume 35, number 2, pp. 141–151.
doi: https://doi.org/10.1080/08824096.2017.1409618, accessed 1 August 2022.
Shelly Chaiken, 1980. “Heuristic versus systematic information processing and the use of source versus message cues in persuasion,” Journal of Personality and Social Psychology, volume 39, number 5, pp. 752–766.
doi: https://doi.org/10.1037/0022-3514.39.5.752, accessed 1 August 2022.
Gina M. Chen, Peter S. Chen, Chen-Wei Cheng, and Zainul Abedin, 2017. “News video quality affects online sites’ credibility,” Newspaper Research Journal, volume 38, number 1, pp. 19–31.
doi: https://doi.org/10.1177/0739532917696087, accessed 1 August 2022.
Victoria Y. Chen and Gina M. Chen, 2020. “Shut down or turn off? The interplay between news overload and consumption,” Atlantic Journal of Communication, volume 28, number 2, pp. 125–137.
doi: https://doi.org/10.1080/15456870.2019.1616738, accessed 1 August 2022.
CrowdTangle, 2020. “A tool from Meta to help follow, analyze, and report on what’s happening across social media,” at https://www.crowdtangle.com, accessed 1 August 2022.
Domenico Cucinotta and Maurizio Vanelli, 2020. “WHO declares COVID-19 a pandemic,” Acta Bio-Medica, volume 91, number 1, pp. 157–160.
doi: https://doi.org/10.23750/abm.v91i1.9397, accessed 1 August 2022.
Daantje Derks, Arjan E.R. Bos, and Jasper von Grumbkow, 2007. “Emoticons and social interaction on the Internet: The importance of social context,” Computers in Human Behavior, volume 23, number 1, pp. 842–849.
doi: https://doi.org/10.1016/j.chb.2004.11.013, accessed 1 August 2022.
Akash Dutt Dubey, 2020. “Twitter sentiment analysis during COVID-19 outbreak,” SSRN (9 April), at https://doi.org/10.2139/ssrn.3572023, accessed 1 August 2022.
Sarah Evanega, Mark Lynas, Jordan Adams, and Karinne Smolenyak, 2020. “Coronavirus misinformation: Quantifying sources and themes in the COVID-19 ‘infodemic’,” Cornell Alliance for Science, at https://allianceforscience.cornell.edu/wp-content/uploads/2020/09/Evanega-et-al-Coronavirus-misinformationFINAL.pdf, accessed 1 August 2022.
Christina Fleuriet, Megan Cole, and Laura Guerrero, 2014. “Exploring Facebook: Attachment style and nonverbal message characteristics as predictors of anticipated emotional reactions to Facebook postings,” Journal of Nonverbal Behavior, volume 38, number 4, pp. 429–450.
doi: https://doi.org/10.1007/s10919-014-0189-x, accessed 1 August 2022.
Cole Freeman, Hamed Alhoori, and Murtuza Shahzad, 2020. “Measuring the diversity of Facebook reactions to research,” Proceedings of the ACM on Human-Computer Interaction, volume 4, article number 12, pp. 1–17.
doi: https://doi.org/10.1145/3375192, accessed 1 August 2022.
Carolin Gerlitz and Anne Helmond, 2013. “The like economy: Social buttons and the data-intensive Web,” New Media & Society, volume 15, number 8, pp. 1,348–1,365.
doi: https://doi.org/10.1177/1461444812472322, accessed 1 August 2022.
Felipe T. Giuntini, Larissa P. Ruiz, Luzianne D. Kirchner, Denise A. Passarelli, Maria De Jesus Dutra Dos Reis, Andrew T. Campbell, and Jó Ueyama, 2019. “How do I feel? Identifying emotional expressions on Facebook reactions using clustering mechanism,” IEEE Access, volume 7, pp. 53,909–53,921.
doi: https://doi.org/10.1109/ACCESS.2019.2913136, accessed 1 August 2022.
Miao Guo and Fu-Shing Sun, 2020. “Like, comment, or share? Exploring the effects of local television news Facebook posts on news engagement,” Journal of Broadcasting & Electronic Media, volume 64, number 5, pp. 736–775.
doi: https://doi.org/10.1080/08838151.2020.1851125, accessed 1 August 2022.
Jessica Guynn, 2020. “Need a hug during the coronavirus pandemic? Facebook is giving you one to share,” USA Today (17 April), at https://www.usatoday.com/story/tech/2020/04/17/facebook-messenger-coronavirus-new-hug-emoji-reaction-support-family-friends/5147510002/, accessed 1 August 2022.
P. Sol Hart, Sedona Chinn, and Stuart Soroka, 2020. “Politicization and polarization in COVID-19 news coverage,” Science Communication, volume 42, number 5, pp. 679–697.
doi: https://doi.org/10.1177/1075547020950735, accessed 1 August 2022.
Rebecca A. Hayes, Caleb T. Carr, and Donghee Y. Wohn, 2016. “One click, many meanings: Interpreting paralinguistic digital affordances in social media,” Journal of Broadcasting & Electronic Media, volume 60, number 1, pp. 171–187.
doi: https://doi.org/10.1080/08838151.2015.1127248, accessed 1 August 2022.
Raffael Heiss, Desiree Schmuck, and Jörg Matthes, 2019. “What drives interaction in political actors’ Facebook posts? Profile and content predictors of user engagement and political actors’ reactions,” Information, Communication, & Society, volume 22, number 10, pp. 1,497–1,513.
doi: https://doi.org/10.1080/1369118X.2018.1445273, accessed 1 August 2022.
Sanne Hille and Piet Bakker, 2014. “Engaging the social news user: Comments on news sites and Facebook,” Journalism Practice, volume 8, number 5, pp. 563–572.
doi: https://doi.org/10.1080/17512786.2014.899758, accessed 1 August 2022.
Michelle L. Holshue, Chas DeBolt, Scott Lindquist, Kathy H. Lofy, John Wiesman, Hollianne Bruce, Christopher Spitters, Keith Ericson, Sara Wilkerson, Ahmet Tural, George Diaz, Amanda Cohn, LeAnne Fox, Anita Patel, Susan I. Gerber, Lindsay Kim, Suxiang Tong, Xiaoyan Lu, Steve Lindstrom, Mark A. Pallansch, William C. Weldon, Holly M. Biggs, Timothy M. Uyeki, and Satish K. Pillai, 2020. “First case of 2019 novel coronavirus in the United States,” New England Journal of Medicine, volume 382, number 10 (5 March), pp. 929–936.
doi: https://doi.org/10.1056/NEJMoa2001191, accessed 1 August 2022.
Avery A. Holton and Hsiang I. Chyi, 2011. “News overload and the overloaded consumer,” Cyberpsychology, Behavior and Social Networking, volume 14, number 11, pp. 619–624.
doi: https://doi.org/10.1089/cyber.2011.0610, accessed 1 August 2022.
Alice Ju, Sun Ho Jeong, and Hsiang I. Chyi, 2014. “Will social media save newspapers? Examining the effectiveness of Facebook and Twitter as news platforms,” Journalism Practice, volume 8, number 1, pp. 1–17.
doi: https://doi.org/10.1080/17512786.2013.794022, accessed 1 August 2022.
Chensoo Kim and Sung-Un Yang, 2017. “Like, comment, and share on Facebook: How each behavior differs from the other,” Public Relations Review, volume 43, number 2, pp. 441–449.
doi: https://doi.org/10.1016/j.pubrev.2017.02.006, accessed 1 August 2022.
Ji won Kim, 2014. “Scan and click: The uses and gratifications of social recommendation systems,” Computers in Human Behavior, volume 33, pp. 184–191.
doi: https://doi.org/10.1016/j.chb.2014.01.028, accessed 1 August 2022.
Jan P. Kluck, Leonie Schaewitz, Nicole C. Krämer, 2019. “Doubters are more convincing than advocates. The impact of user comments and ratings on credibility perceptions of false news stories on social media,” Studies in Communication and Media, volume 8, number 4, pp. 446–470.
doi: https://doi.org/10.5771/2192-4007-2019-4-446, accessed 1 August 2022.
Emily A. Largent, Govind Persad, Samantha Sangenito, Aaron Glickman, Connor Boyle, and Ezekiel J. Emanuel, 2020. “US public attitudes toward COVID-19 vaccine mandates,” JAMA Network Open, volume 3, number 12, e2033324.
doi: https://doi.org/10.1001/jamanetworkopen.2020.33324, accessed 1 August 2022.
Anders Olaf Larsson, 2018. “Diversifying likes: Relating reactions to commenting and sharing on newspaper Facebook pages,” Journalism Practice, volume 12, number 3, pp. 326–343.
doi: https://doi.org/10.1080/17512786.2017.1285244, accessed 1 August 2022.
Carl A. Latkin, Lauren Dayton, Meghan Moran, Justin C. Strickland, and Karina Collins, 2021a. “Behavioral and psychosocial factors associated with COVID-19 skepticism in the United States,” Current Psychology (6 January).
doi: https://doi.org/10.1007/s12144-020-01211-3, accessed 1 August 2022.
Carl A. Latkin, Lauren Dayton, Grace Yi, Brian Colon, and Xiangrong Kong, 2021b. “Mask usage, social distancing, racial, and gender correlates of COVID-19 vaccine intentions among adults in the US,” PLoS ONE, volume 16, number 2, e0246970.
doi: https://doi.org/10.1371/journal.pone.0246970, accessed 1 August 2022.
Siyue Li, Kathryn D. Coduto, and Chi Song, 2020. “Comments vs. one-click reactions: Seeking and perceiving social support on social network sites,” Journal of Broadcasting & Electronic Media, volume 64, number 5, pp. 777–793.
doi: https://doi.org/10.1080/08838151.2020.1848181, accessed 1 August 2022.
George E. Marcus, W. Russell Neuman, and Michael MacKuen, 2000. Affective intelligence and political judgment. Chicago: University of Chicago Press.
Gina M. Masullo and Jiwon Kim, 2021. “Exploring ‘angry’ and ‘like’ reactions on uncivil Facebook comments that correct misinformation in the news,” Digital Journalism, volume 8, number 18, pp. 1,103–1,122.
doi: https://doi.org/10.1080/21670811.2020.1835512, accessed 1 August 2022.
Gina M. Masullo, Alexander L. Curry, Kelsey N. Whipple, and Caroline Murray, 2021. “The story behind the story: Examining transparency about the journalistic process and new outlet credibility,” Journalism Practice (11 January).
doi: https://doi.org/10.1080/17512786.2020.1870529, accessed 1 August 2022.
Miriam J. Metzger and Andrew J. Flanagin, 2013. “Credibility and trust of information in online environments: The use of cognitive heuristics,” Journal of Pragmatics, volume 59, part B, pp. 210–220.
doi: https://doi.org/10.1016/j.pragma.2013.07.012, accessed 1 August 2022.
Philip Meyer, 2004. The vanishing newspaper: Saving journalism in the information age. Columbia: University of Missouri Press.
Matt Motta, Dominik Stecula, and Christina Farhart, 2020. “How right-leaning media coverage of COVID-19 facilitated the spread of misinformation in the early stages of the pandemic in the U.S.,” Canadian Journal of Political Science/Revue Canadienne de Science Politique, volume 53, number 2, pp. 335–342.
doi: https://doi.org/10.1017/S0008423920000396, accessed 1 August 2022.
Teresa K. Naab, Dominique Heinbach, Marc Ziegele, and Marie-Theres Grasberger, 2020. “Comments and credibility: How critical user comments decrease perceived news article credibility,” Journalism Studies, volume 21, number 6, pp. 783–801.
doi: https://doi.org/10.1080/1461670X.2020.1724181, accessed 1 August 2022.
Nic Newman, with Richard Fletcher, Anne Schulz, Simge Andi, and Rasmus K. Nielsen, 2020. “Reuters Institute digital news report 2020,” Reuters Institute, University of Oxford, at https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2020-06/DNR_2020_FINAL.pdf, accessed 1 August 2022.
Seth M. Noar and Lucinda Austin, 2020. “(Mis)communicating about COVID-19: Insights from health and crisis communication,” Health Communication, volume 35, number 14, pp. 1,735–1,739.
doi: https://doi.org/10.1080/10410236.2020.1838093, accessed 1 August 2022.
Anne Oeldorf-Hirsch and S. Shyam Sundar, 2015. “Posting, commenting, and tagging: Effects of sharing news stories on Facebook,” Computers in Human Behavior, volume 44, pp. 240–249.
doi: https://doi.org/10.1016/j.chb.2014.11.024, accessed 1 August 2022.
Marie Ozanne, Ana Cueva Navas, Anna S. Mattila, and Hubert B. Van Hoof, 2017. “An Investigation into Facebook ‘liking’ behavior and an exploratory study,” Social Media + Society (10 May).
doi: https://doi.org/10.1177/2056305117706785, accessed 1 August 2022.
Cynthia Peacock, Gina M. Masullo, and Natalie J. Stroud, 2022. “The effect of news labels on perceived credibility,” Journalism, volume 23, number 2, pp. 301–319.
doi: https://doi.org/10.1177/1464884920971522, accessed 1 August 2022.
Pablo Porten-Cheé, Jörg Haßler, Pable Jost, Christiane Eilders, and Marcus Maurer, 2018. “Popularity cues in online media: Theoretical and methodological perspectives,” Studies in Communication and Media, volume 7, number 2, pp. 208–230.
doi: https://doi.org/10.5771/2192-4007-2018-2-80, accessed 1 August 2022.
Fabian Prochazka, Patrick Weber, and Wolfgang Schweiger, 2018. “Effects of civility and reasoning in user comments on perceived journalistic quality,” Journalism Practice, volume 19, number 1, pp. 62–78.
doi: https://doi.org/10.1080/1461670X.2016.1161497, accessed 1 August 2022.
Joseph M. Reagle, 2015. Reading the comments: Likers, haters, and manipulators at the bottom of the Web. Cambridge, Mass.: MIT Press.
doi: https://doi.org/10.7551/mitpress/10116.001.0001, accessed 1 August 2022.
Daniel Riffe, Stephen Lacy, Brendan R. Watson, and Frederick Fico, 2019. Analyzing media messages: Using quantitative content analysis in research. Fourth edition. New York: Routledge.
doi: https://doi.org/10.4324/9780429464287, accessed 1 August 2022.
Susana Salgado and Giuliano Bobba, 2019. “News on events and social media: A comparative analysis of Facebook users’ reactions,” Journalism Studies, volume 20, number 15, pp. 2,258–2,276.
doi: https://doi.org/10.1080/1461670X.2019.1586566, accessed 1 August 2022.
Elia Shearer, 2020. “Local news is playing an important role for Americans during COVID-19 outbreak,” Pew Research Center (2 July), at https://www.pewresearch.org/fact-tank/2020/07/02/local-news-is-playing-an-important-role-for-americans-during-covid-19-outbreak/, accessed 1 August 2022.
Elia Shearer and Amy Mitchell, 2021. “News use across social media platforms in 2020: Facebook stands out as a regular source of news for about a third of Americans,” Pew Research Center (12 January), at https://www.journalism.org/2021/01/12/news-use-across-social-media-platforms-in-2020/, accessed 1 August 2022.
Erin Spottswood and Donghee Yvette Wohn, 2019. “Beyond the ‘like’: How people respond to negative posts on Facebook,” Journal of Broadcasting & Electronic Media, volume 63, number 2, pp. 250–267.
doi: https://doi.org/10.1080/08838151.2019.1622936, accessed 1 August 2022.
Leona Yi-Fan Su, Michael A. Xenos, Kathleen M. Rose, Christopher Wirz, Dietram A. Scheufele, and Dominique Brossard, 2018. “Uncivil and personal? Comparing patterns of incivility in comments on the Facebook pages of news outlets,” New Media & Society, volume 20, number 10, pp. 3,678–3,699.
doi: https://doi.org/10.1177/1461444818757205, accessed 1 August 2022.
S. Shyam Sundar, 2008. “The MAIN model: A heuristic approach to understanding technology effects on credibility,” In: Miriam J. Metzger and Andrew J. Flanagin (editors). Digital media, youth, and credibility. Cambridge, Mass.: MIT Press, pp. 72–100, and at https://www.issuelab.org/resources/875/875.pdf, accessed 1 August 2022.
S. Shyam Sundar, Silvia Knobloch-Westerwick, and Matthia R. Hastall, 2007. “News cues: Information scent and cognitive heuristics,” Journal of the American Society for Information Science and Technology, volume 58, number 3, pp. 366–378.
doi: https://doi.org/10.1002/asi.2051, accessed 1 August 2022.
Barbara G. Tabachnick and Linda S. Fidell, 2007. Using multivariate statistics. Fifth edition. New York: Pearson/Allyn & Bacon.
Edwin C. Tandoc, Jr. and Timothy Vos, 2016. “The journalist is marketing the news: Social media in the gatekeeping process,” Journalism Practice, volume 10, number 8, pp. 950–966.
doi: https://doi.org/10.1080/17512786.2015.1087811, accessed 1 August 2022.
Ori Tenenboim and Akiba A. Cohen, 2015. “What prompts users to click and comment: A longitudinal study of online news,” Journalism, volume 16, number 2, pp. 198–217.
doi: https://doi.org/10.1177/1464884913513996, accessed 1 August 2022.
Yi Tian, Thiago Galery, Giulio Dulcinati, Emilia Molimpakis, and Chao Sun, 2017. “Facebook sentiment: Reactions and emoji,” Proceedings of the Fifth International Workshop on Natural Language Processing for Social Media, pp. 11–16, at https://aclanthology.org/W17-1102/, accessed 1 August 2022.
Juliane Urban and Wolfgang Schweiger, 2014. “News quality from the recipients’ perspective: Investigating recipients’ ability to judge the normative quality of news,” Journalism Studies, volume 15, number 6, pp. 821–840.
doi: https://doi.org/10.1080/1461670X.2013.856670, accessed 1 August 2022.
Joseph E. Uscinski, Michelle Seelig, Stefan Wuchty, Adam M. Enders, John Funchion, Kamal Premaratne, Casey Klofstad, Caleb Everett, and Manohar Murthi, 2020. “Why do people believe COVID-19 conspiracy theories?” Misinformation Review (28 April), at https://misinforeview.hks.harvard.edu/article/author/manohar-murthi/, accessed 1 August 2022.
Cristian Vaccari and Augusto Valeriani, 2018. “Digital political talk and political participation: Comparing established and third wave democracies,” SAGE Open (26 June).
doi: https://doi.org/10.1177/2158244018784986, accessed 1 August 2022.
T. Franklin Waddell, 2020. “The authentic (and angry) audience: How comment authenticity and sentiment impact news evaluation,” Digital Journalism, volume 8, number 2, pp. 249–266.
doi: https://doi.org/10.1080/21670811.2018.1490656, accessed 1 August 2022.
Pamela Wisniewski, Karla Badillo-Urquiola, Zahra Ashtorab, and Jessica Vitak, 2020. “Happiness and fear: Using emotions as a lens to disentangle how users felt about the launch of Facebook reactions,” ACM Transactions on Social Computing, volume 3, number 4, article number 20, pp. 1–25.
doi: https://doi.org/10.1145/3414825, accessed 1 August 2022.
Donghee Yvette Wohn, Caleb T. Carr, and Rebecca A. Hayes, 2016. “How affective is a ‘like’? The effect of paralinguistic digital affordances on perceived social support,” Cyberpsychology, Behavior, and Social Networking, volume 19, number 9, pp. 562–566.
doi: https://doi.org/10.1089/cyber.2016.0162, accessed 1 August 2022.
Received 14 June 2022; revised 18 July 2022; accepted 1 August 2022.
This paper is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Facebook reactions as heuristics: Exploring relationships between reactions and commenting frequency on news about COVID-19
by Gina M. Masullo.
First Monday, Volume 27, Number 8 - 1 August 2022