While many solutions have been proposed to combat misinformation on social media, most are ineffective, expensive, or fail to work at scale. What if social media users could help mitigate the misinformation they are also responsible for proliferating?
In Observed Correction, Leticia Bode and Emily K. Vraga consider both the power of and the barriers to "observed correction": users witnessing other users correct misinformation on social media. Bode and Vraga argue that when people see others directly and publicly correct misinformation on social media, their understanding of the topic becomes more accurate. Yet while many members of the public value correction, Bode and Vraga find that they are often reluctant to correct misinformation they see on social media. This same reluctance is evident among expert fact checkers and health communicators, compounded by limited resources and competing priorities. To empower people to respond to misinformation, Bode and Vraga offer a set of practical recommendations for implementing observational correction. In some cases, simple messages addressing users' concerns can increase their willingness to respond to misinformation. In other cases, they argue, platforms will need to promote corrections and protect the correctors, while experts can contribute by creating accessible curated evidence (ACE) to facilitate user corrections and build social norms around responding to misinformation.
Drawing on eleven experiments, seven surveys, and dozens of interviews with social media users, health professionals, fact checkers, and platform employees about their efforts to curb misinformation online, Bode and Vraga make the case that observed correction is an effective and scalable tool in the fight against bad content on the Internet.