FOR RELEASE NOVEMBER 16, 2018

BY Aaron Smith

FOR MEDIA OR OTHER INQUIRIES:
Aaron Smith, Associate Director
Haley Nolan, Communications Assistant
202.419.4372
www.pewresearch.org

RECOMMENDED CITATION: Pew Research Center, November 2018, "Public Attitudes Toward Computer Algorithms"
Algorithms are all around us, utilizing massive stores of data and complex analytics to make decisions that often have significant impacts on humans. They recommend books and movies for us to read and watch, surface news stories they think we might find relevant, estimate the likelihood that a tumor is cancerous and predict whether someone might be a criminal or a worthwhile credit risk.

But despite the growing presence of algorithms in many aspects of daily life, a Pew Research Center survey of U.S. adults finds that the public is frequently skeptical of these tools when they are used in various real-life situations.

This skepticism spans several dimensions. At a broad level, 58% of Americans feel that computer programs will always reflect some level of human bias, although 40% think these programs can be designed in a way that is bias-free. And in various contexts, the public worries that these tools might violate privacy, fail to capture the nuance of complex situations, or simply put the people they are evaluating in an unfair situation.

Public perceptions of algorithmic decision-making are also often highly contextual. The survey shows that otherwise similar technologies can be viewed with support or suspicion depending on the circumstances or on the tasks they are assigned to do.

To gauge the opinions of everyday Americans on this relatively complex and technical subject, the survey presented respondents with four different scenarios in which computers make decisions by collecting and analyzing large quantities of public and private data.
Each of these scenarios was based on a real-world example of algorithmic decision-making (see accompanying sidebar). They included: a personal finance score used to offer consumers deals or discounts; a criminal risk assessment of people up for parole; an automated resume screening program for job applicants; and a computer-based analysis of job interviews.

Real-world examples of the scenarios in this survey

All four of the concepts discussed in the survey are based on real-life applications of algorithmic decision-making and artificial intelligence (AI):

- Numerous firms now offer nontraditional credit scores that build their ratings using thousands of data points about customers' activities and behaviors, under the premise that "all data is credit data."
- States across the country use criminal risk assessments to estimate the likelihood that someone convicted of a crime will reoffend in the future.
- Several multinational companies are currently using AI-based systems during job interviews to evaluate the honesty, emotional state and overall personality of applicants.
- Computerized resume screening is a longstanding and common HR practice for eliminating candidates who do not meet the requirements for a job posting.