New research shows people are more likely to rely on algorithms than on other people as tasks get harder

Despite growing concern over the intrusion of algorithms into everyday life, people may be more willing to trust a computer program than their peers, especially when a task becomes too difficult, according to new research by data scientists from the University of Georgia.

From choosing the next song on your playlist to picking the right size of pants, people are relying more and more on the advice of algorithms to help make everyday decisions and streamline their lives.

“Algorithms are capable of performing a large number of tasks, and the number of tasks they are able to perform is growing almost every day,” said Eric Bogert, a Ph.D. student in the department of management information systems at the Terry College of Business. “There seems to be a bias toward leaning more heavily on algorithms as a task becomes more difficult, and that effect is stronger than the bias toward relying on advice from other people.”

Bogert worked with management information systems professor Rick Watson and assistant professor Aaron Schecter on the paper, “Humans rely more on algorithms than social influence as a task becomes more difficult,” which was published April 13 in Nature’s Scientific Reports journal.

Their study, which involved 1,500 people reviewing photographs, is part of a larger body of work analyzing how and when people work with algorithms to process information and make decisions.

For this study, the team asked volunteers to count the number of people in a photograph of a crowd and provided them with suggestions generated by a group of other people as well as suggestions generated by an algorithm.

As the number of people in the photo increased, counting became more difficult, and people were more likely to follow the suggestion generated by an algorithm rather than counting themselves or following the “wisdom of the crowd,” Schecter said.

Schecter explained that counting was chosen as the trial task because the job becomes objectively harder as the number of people in the photo increases. It is also the type of task that laypeople expect computers to perform well.

“It’s a task that people perceive a computer to be good at, even though it may be more subject to bias than counting objects,” Schecter said. “One of the common problems with AI is when it is used to grant credit or approve someone for a loan. While that is a subjective decision, there are a lot of numbers in there – like income and credit rating – so people feel like it’s a good job for an algorithm. But we know that dependence leads to discriminatory practices in many cases because of social factors that aren’t taken into account.”

Facial recognition and hiring algorithms have also come under intense scrutiny in recent years because their use has revealed cultural biases in the way they were built, which can lead to inaccuracies when matching faces to identities or selecting qualified candidates, Schecter said.

These biases may not be present in a simple task like counting, but their presence in other trusted algorithms is one reason why it is important to understand how people rely on algorithms when making decisions, he added.

This study was part of Schecter’s larger research program on human-machine collaboration, which is funded by a $300,000 grant from the U.S. Army Research Office.

“The end goal is to look at groups of humans and machines that make decisions and find out how we can get them to trust each other and how that changes their behavior,” Schecter said. “Because there is very little research in this context, we start with the fundamentals.”

Schecter, Watson, and Bogert are currently studying how people rely on algorithms when making creative judgments and moral judgments, such as writing descriptive passages and setting bail for prisoners.

Story Source:

Materials provided by University of Georgia. Original written by J. Merritt Melancon. Note: Content may be edited for style and length.
