Research

Robots at the reporting desk

Penn State researchers explore how people feel about robots writing the news

Credit: Adapted from Flickr user 62693815@N03. All Rights Reserved.

If you’ve checked out an online news site lately, there’s a good chance at least one of the stories you’ve read was written by a robot. The Associated Press, the world’s biggest news organization, churns out almost 5,000 robot-written stories per quarter, and Forbes uses robots to write many of its company earnings reports.

While it’s a fact that robots are now writing at least some of the news, what’s still a mystery is how consumers feel about these new robo-writers.

The question inspired Andrew Gambino, a University fellow and doctoral candidate in the College of Communications, to embark on a new study alongside S. Shyam Sundar, distinguished professor and co-director of Penn State’s Media Effects Research Lab, and fellow doctoral candidate Jinyoung Kim.

“I started hearing about robot-written articles a couple years ago, and as someone who appreciates writing, I was generally curious about how well these algorithms could write,” said Gambino. “But also, my main focus as a researcher is the psychological relationship between artificial intelligence and humans. So I wanted to explore how much people like and trust these articles written by robots.”

Gambino’s research group presented 435 participants with an article on one of three subjects: health, finance or politics. Although all three articles were generated by a robot, half the participants were told their article was written by a human journalist.

Additionally, half the participants were told their assigned article was from the New York Times, while the other half were told it came from the National Enquirer.

After the participants read their assigned article and answered a series of questions about it, Gambino’s group found that readers preferred the financial article when it was attributed to a robot, but preferred the health article when it was attributed to a human writer. (For the political article, there was no preference either way.)

Gambino says he was somewhat surprised by the results. He had expected the robot-written financial and health articles to be viewed the same, but participants responded to the two quite differently.

“It seems that we might not be as comfortable with robots delivering news related to health,” Gambino said. “We suspect that this was because of an ‘eeriness’ or a creepy feeling the participants felt, and our results backed this up.”

This eeriness is a concept often referred to as the “Uncanny Valley.” The idea, proposed in 1970 by roboticist Masahiro Mori, is that while humans generally like robots that look more like humans (they might prefer a robot with a face, for example), they start to feel uneasy and a little creeped out when a robot becomes just a little too human, such as one that walks like a person but with exaggerated knee movements.

Although “eeriness” and “creepiness” seem like intangible concepts, the researchers found a way to measure how uneasy the participants felt about a robot writing their news: participants were asked how much they agreed with statements like “the prose in the story seemed natural” and “the prose in the story seemed spooky.”

Participants who thought they were reading robot-written stories tended to report higher levels of eeriness, and in turn trusted the article less and rated it as lower quality.

"Eeriness happens when people feel like something a robot does is too weirdly human,” said Sundar. “This raises the issue of potential privacy concerns with machines taking over human roles. While people might trust the abilities of the machine, I'm not sure if they trust its ability to keep things private or secure. In other words, discretion."

The research group also found that even though the articles were the same text, participants trusted those labeled as being from the New York Times more than the ones they were told came from the National Enquirer.

“That's called the branding effect, which we find in traditional journalism and media effects literature," said Gambino. "However, it was still surprising to see that even when we're talking about a robot, we still care about its image or brand. It's like, 'We don't trust those bad robots, but we'll trust these good robots.'"

Sundar says the study is just one in a series of many that are exploring the psychological effects of “machine agency” — the idea that machines are becoming more independent and autonomous.

“We’re entering an era where a lot of machines are attaining a status that previously was so sacredly human,” said Sundar. “Our larger project is exploring what it means for machines to be their own independent agents, whether they give us information, get work done behind the scenes or personalize things.”

With robots becoming more interactive, Sundar has looked at who people blame when a robot makes a mistake: themselves or the machine. When someone interacts with a machine like a GPS system, both the user and the machine make decisions; the user inputs certain information, and the system makes decisions based on it. Sundar found that people tend to trust the machine’s decisions and blame themselves when something goes wrong.

As people start to trust machines more, Sundar says they begin to be more comfortable with robots doing things that were previously reserved for humans.

“I feel like there is a growing acceptance toward machines doing things that previously only humans could do,” said Sundar. “In journalism, this might be because we’re aware of human biases and don’t want our news skewed, so we actually prefer a machine.”

Although this can sound scary to reporters, Gambino says it’s valuable information for journalists to have, whether they are beginning their careers or looking to improve their skill sets.

“Journalists can bring skills to the table that machines cannot, like interviewing and writing more creative pieces,” said Gambino. “Hopefully, these algorithms will free up writers to do the work they want to do and are passionate about, instead of the work that’s more tedious.”

Although more creative skills, like poetry- and novel-writing, are still considered generally safe from robot takeover, some researchers now say that machines will be able to write creative fiction, and soon. Gambino, though, remains optimistic that true art will always need a human’s touch.

“Shyam and I both have deep-seated appreciations for art, and I feel like there’s something inherently human about creating good art,” Gambino said.

He paused.

“But I’ll also be first in line to read that first machine-written novel when it comes out.”

For more IT stories at Penn State, visit http://news.it.psu.edu.

Last Updated July 28, 2017
