
Certain interactive tools click with web users

The researchers examined how people interacted with content using several web navigation tools, including clicking, sliding, zooming, hovering, dragging and flipping, along with combinations of those tools. Credit: © iStock Photo pablocalvog. All Rights Reserved.

SEOUL, South Korea -- Before web developers add the newest bells and whistles to their website designs, a team of researchers suggests they zoom in on the tools that click with the right users for the right tasks.

"When designers create sites, they have to make decisions on what tools and features they use and where they put them, which takes a lot of planning," said S. Shyam Sundar, Distinguished Professor of Communications and co-director of the Media Effects Research Laboratory. "You not only have to plan where the feature will be, you also have to design what will go underneath that layer, then create the content for it, so we wanted to know if these new, more sophisticated ways of interacting with a site are actually better than just clicking."

The researchers, who presented their findings Tuesday (April 21) at the Computer-Human Interaction (CHI) conference in Seoul, South Korea, suggest that interactive tools can affect not only how people use a website, but also how they feel about the site, what they think about its content and what information they retain after they use it.

In a series of studies, the researchers examined how people interacted with content using several web navigation tools -- clicking, sliding, zooming, hovering, dragging and flipping -- along with combinations of those tools, according to Sundar. The researchers also measured how much information participants retained during the sessions as a way to gauge how absorbed users were in the task.

Participants indicated that the slider, which allowed them to scroll along a timeline to view images and text about a historical event, was better at aiding memory than other tools, including a more recent navigational innovation, the 3-D carousel, which allows users to rotate images.

Sundar said that users spent more time on the carousel and interacted with it frequently, but that the number of interactions and the length of time spent on the tool did not necessarily mean they found the carousel mentally engaging. Good looks do not necessarily lead to better usability, he added.

"The 3-D carousel looks attractive, but in terms of encoding information, it was not effective," said Sundar.

This discrepancy between the high level of interaction and the low level of engagement may also mean that a commonly used metric -- how long a person has remained on a site -- does not necessarily indicate a positive user experience.

"We used to think that the more time a user spent on a page or feature, or how 'sticky' it is, was a good thing and that it meant they were more interested in the page," said Sundar. "However, it could also mean they are confused and having trouble navigating."

Clicking, one of the web's earliest navigational tools, remains a popular choice for users, according to the researchers. They also found that a user's level of web experience influenced how the tools affected attitudes toward site content. Expert web users, for example, liked the content more and found it more credible when the site used simple clicking and mouse-over tools rather than less intuitive tools such as the 3-D carousel and drag. The reverse was true for users with limited technology expertise.

"These techniques may be less natural to use, but they are seen as fancy by lay users," said Sundar. "They have a 'halo effect' on content."

Regardless of these differences across users, the finding that interactive tools can shape how users think and feel about media content is an important discovery, he added.

In the first study, the researchers recruited 128 college students and assigned each to one of 20 websites designed to test the individual interaction techniques. The content was identical across all the sites. For a second study examining combinations of website tools, the researchers recruited 127 college students, each assigned to one of six website versions designed to test reactions to those combinations.

Sundar worked with Saraswathi Bellur, assistant professor of communication at the University of Connecticut; Qian Xu, assistant professor of communications at Elon University; Haiyan Jia, post-doctoral scholar in information sciences and technology at Penn State; and Jeeyun Oh, assistant professor of communications at Robert Morris University. All are former Penn State doctoral students in mass communications.
