Mitigating the Impacts of Disinformation and/or Misinformation on US Social Media Platforms
Summary
Disinformation and misinformation on social media platforms are significant problems that pose
serious threats to individuals and to society at large. This research paper identifies three main
causes of the proliferation of false information on US social media platforms: political
polarization, profit-driven algorithms, and human error and bias. The impacts of disinformation and
misinformation on these platforms include undermining democratic processes, fueling social unrest,
damaging public health, and eroding trust in institutions and the media.
To mitigate the impact of disinformation and misinformation on US social media platforms, the
paper identifies two options. The first is self-regulation, in which social media companies are
given incentives to take responsibility for the content shared on their platforms. The second is
government regulation, which would mandate standards that social media companies must meet for the
content shared on their platforms.
Both options carry rewards and risks. The rewards of self-regulation include increased public
trust in social media platforms and a reduction in the spread of false information; the risks
include the potential for social media companies to abuse their power and suppress information
they disagree with. The rewards of government regulation are similar: increased public trust in
social media platforms and a potential reduction in the spread of false information. Its risks
include concerns about censorship and potential violations of First Amendment rights.