Penn State team develops simulator to train teachers to deal with bullies

The simulator prepares teachers to interact with bullied students in an effective way. Credit: U.S. Department of Education. All Rights Reserved.

A 12-year-old boy named Alex sits on a school bus, trying desperately to shield himself from the punches thrown at him by a kid in the next seat. The blows come after his tormentors already threatened to break his bones and scared off the few friends he did have.

Alex’s crime? A wide nose, a mouth that droops down at the corners and perhaps a little social awkwardness. It’s a scene in the documentary “Bully,” and one that’s often repeated in schools across America.

Bullying has become an epidemic. According to the American Society for the Positive Care of Children, one in three children report being bullied at school. But Jennifer Frank, a Penn State assistant professor, wants to help change that with a new artificial intelligence (AI) simulator.

One of the keys to stopping bullying, she said, is properly preparing teachers to interact with bullied students in an effective way — a skill that doesn’t necessarily come naturally to everyone.

Frank, along with her co-principal investigator Deb Schussler, has created an AI simulation that will allow teachers to practice interacting with a bullying victim. The system, called a chatbot, is designed to resemble a real person chatting in a virtual setting and gives teachers practice without the pressure and high stakes of a real-life conversation.

As an assistant professor in Penn State’s College of Education, Frank said part of her job is to train teachers to work with people. Often, that training includes having her students practice these interactions through role-playing. But Frank said it’s not always an effective tactic.

“Not everyone volunteers or gets the chance to practice through role-playing, and you can’t go back and watch the session afterward,” Frank said. “But if you’re on a computer practicing with an AI simulation, you can really slow down and think about your response if a child comes up to you and says, ‘I’ve been bullied. What do I do?’”

Frank said the slowing down part is key. When people are confronted by a situation they don’t know how to deal with, they are most likely to respond with the first thing that comes to mind or to freeze.

“We often don’t know how to respond to many of the most important situations in life, but the way you do respond can really stick with a kid for a lifetime,” said Frank. “If you’ve thought through and practiced these interactions ahead of time, you’re more likely to respond in a positive way. The AI bot doesn’t have any sense of time and allows you to slow it down and practice these scripts.”

To kick off the project, Frank and Schussler received grant money in January from Penn State’s Center for Online Innovation in Learning (COIL). They then enlisted the help of Penn State’s Teaching and Learning with Technology (TLT) to design the actual program and features.

Tsan-Kuang (T.K.) Lee, a lead developer in TLT, took the lead on designing and implementing the system’s architecture, streamlining the AI learning process and running tests, among other tasks. The project required him to work in four programming languages, improve the Artificial Intelligence Markup Language (AIML), use two types of servers and draw on his knowledge of AI.

The program is designed to look like the user is having a conversation online or through text message with a bullying victim. A small photo of the “victim” appears in the upper-left corner with the chat dialogue appearing to the right. The user initiates the conversation by typing a greeting, and replies from the “victim” appear immediately after.

The interface is simple, clean and easy to use. Lee said that while there were features he wanted to add, the user experience team on the project — including Bevin Hernandez, Serena Epstein and Nick Rossi — convinced him otherwise.

“I had come up with a lot of really cool features, but they ultimately persuaded me not to use them,” said Lee. “If users think a system is too complicated, they’ll give up using it. A bit of geekiness is fun, but an overdose kills.”

After the program’s system and interface were built, it was time to create the AI component. To build the bot’s language, Frank brought on two research assistants: Emily Chukusky and Alex Callopy. The two used AIML to teach the bot how to reply when users type in their questions and responses.
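
The project’s own AIML files aren’t shown here, but the basic building block the research assistants would have worked with is the AIML “category,” which pairs an input pattern with a reply template. Below is a minimal, purely illustrative sketch; the pattern and the victim’s wording are hypothetical, not the project’s actual content:

    <aiml version="1.0.1">
      <!-- A category pairs one input pattern with one reply template. -->
      <!-- The interpreter normalizes user input (uppercasing it and stripping -->
      <!-- punctuation) before matching it against the pattern. -->
      <category>
        <pattern>ARE YOU OKAY</pattern>
        <template>Not really. Some kids on the bus keep picking on me.</template>
      </category>
    </aiml>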

Chukusky, a recent Penn State graduate, wanted the bot to be as smart as possible, so part of her job was brainstorming as many conversation options as possible. For example, Chukusky started by listing as many greetings as she could think of (hello, hi, hey, etc.) before moving on to the different ways people ask others how they’re doing.
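
In AIML, that kind of brainstorming is typically folded into a few canonical categories, with synonymous phrasings redirected to them through the <srai> tag so each new variant needs only a one-line entry. A hedged sketch of how the greetings she lists might be handled, with reply text invented for illustration:

    <aiml version="1.0.1">
      <!-- Canonical greeting category; the reply here is illustrative only. -->
      <category>
        <pattern>HELLO</pattern>
        <template>Hey.</template>
      </category>

      <!-- Synonymous greetings redirect to the canonical pattern via srai. -->
      <category>
        <pattern>HI</pattern>
        <template><srai>HELLO</srai></template>
      </category>
      <category>
        <pattern>HEY</pattern>
        <template><srai>HELLO</srai></template>
      </category>
    </aiml>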

Once she moved past simple greetings, Chukusky had to build the language of someone who’s being bullied — something she wasn’t familiar with. To research the topic, she watched documentaries on bullying and cited Alex’s scenes in “Bully” as particularly startling.

“I grew up in a small town and wasn’t bullied myself, so it was a topic I wasn’t familiar with,” Chukusky said. “Watching these documentaries and seeing the problems school districts face today was very eye opening. It made me realize that teachers really need to know how to handle these situations and that our project could help.”

Armed with new insights, Chukusky finished programming the system’s vocabulary. Partway through, she also realized she had to provide responses to negative phrases the user might type. Someone asking “What’s wrong with you” instead of “What’s wrong,” for example, could come across as negative rather than supportive.
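
Because the interpreter matches each normalized input against its own pattern, the harsher phrasing can simply be given its own category with a different reply. A hypothetical sketch, assuming the interpreter’s substitutions expand a contraction like “what’s” to “what is” before matching (the replies are again invented):

    <aiml version="1.0.1">
      <!-- The general question is treated as a caring check-in. -->
      <category>
        <pattern>WHAT IS WRONG</pattern>
        <template>Some kids have been threatening me at school.</template>
      </category>

      <!-- The harsher phrasing matches its own pattern and gets a more -->
      <!-- guarded response. -->
      <category>
        <pattern>WHAT IS WRONG WITH YOU</pattern>
        <template>Nothing is wrong with ME. Why are you asking like that?</template>
      </category>
    </aiml>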

Now, after everyone’s hard work and collaboration, the program has reached its prototype stage and is being tested by Penn State pre-service teachers (those in undergraduate programs before they begin student teaching). Additionally, in February the program will be presented at the American Association of Colleges for Teacher Education conference and the Pennsylvania Association for Middle Level Education conference.

While Lee said it’s gratifying to see the program he helped build turn into a finished product, he also said he was motivated by trying to bring a greater good to the world.

“I believe that teaching kids not to bully can lead to a better world where people respect each other more. Stronger kids bully shorter kids, big countries bully smaller countries and human beings bully the environment,” Lee said. “It comes down to a choice between using power or using respect, and we are helping teachers help kids make the right choice.”

For more IT stories at Penn State, visit http://news.it.psu.edu.

Last Updated November 24, 2014