Gamifying empathy with an emotional chatbot

Games, unlike simple quizzes or tests, put the learner in a position where they need to ‘transform’ to achieve the best outcome. And they’re encouraged to do this because overcoming failure is part of the fun.

In this post, I’m going to run you through an example of how we put this into practice with gamification in business and learning, along with some of the challenges we encountered along the way. As always, it’s through challenges that the most growth takes place.

I’ll start off by setting the scene:

Client industry: Finance

Problem: Dealing with angry customers

Case study: Our HowToo Xperts designed an emotional chatbot to train staff in the finance industry in dealing with angry customers.

The results were both surprising and a lot of fun. 

What is a chatbot?

As defined by Wikipedia, a chatbot or chatterbot is a software application used to conduct an online chat conversation via text or text-to-speech, in lieu of providing direct contact with a live human agent.

Our HowToo Xperts are always searching for ways new technology can positively impact the working environment. In this example, we used gamification to match the user experience with specific learning outcomes.

How to gamify your eLearning

The aim of any gamification experience is to narrow the gap between the learning experience and the real world.

Decide on your motives

We were approached by a client in the financial services industry who wanted to train new staff to deal with angry customers, in response to an increase in verbal and physical attacks on the job.

The learning outcomes included:

  • Safety procedures
  • How to read aggressive body language
  • How to de-escalate a threatening situation.

This was an exciting opportunity for us to not just build an interesting piece of learning architecture, but to build a game that might transform the relationship between front line staff and their customers.

In this situation, both the person working at the financial institution and the customer are playing parallel ‘games’. They both have goals and might see the other person as their key obstacle to be overcome.

The motive: The staff member wants to de-escalate possible conflict and resolve the customer’s issue, and the customer wants their issue resolved. 

The problem: In each case the agenda is different.

The solution: Change the perspective.

In other words, we would present the staff member with a game where the apparent goal was to ‘solve’ a problematic customer, but the true goal was to help that customer achieve their own outcomes. And the only way that could happen was by transforming their perspective of the encounter. To do this, we would train them to empathise. 

With that in mind, it’s important to consider the obstacles.

Consider any problems you may encounter

Our proposal was to present the learner with a gamified simulation where they would have to de-escalate a realistic situation. We would present them with a virtual customer with an obvious financial problem. However, the customer would also have a hidden personal problem that was making them unreasonably angry.

The learner would have to choose between a number of preferable and less preferable conversation options that could bring them closer to resolving the financial issue and unlock new information about the personal issue, which would provide context for the customer’s anger.

Traditionally, this kind of scenario might be built with a ‘branching narrative’, one of the key tools in an instructional designer’s arsenal.

Simpler cousins of interactive fiction games and ‘choose your own adventure’ books, branching narratives allow the learner to choose between a few possible plans of action and see the consequences of those actions. Like this:

[Image: a simple branching narrative diagram]

To make this manageable from a design perspective, they also usually have ‘choke points’: after one or two choices, the narrative will bring the learner back to the central path.

[Image: a branching narrative with choke points converging on a central path]
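To make the structure concrete, here’s a minimal sketch of a traditional branching narrative with a choke point. It’s written in TypeScript, and the node names and dialogue are invented for illustration rather than taken from the actual module:

```typescript
// A minimal sketch of a traditional branching narrative with a choke point.
// Node names and dialogue are illustrative only.

interface Choice {
  label: string; // what the learner sees, e.g. "Apologise and ask what happened"
  next: string;  // id of the node this choice leads to
}

interface StoryNode {
  id: string;
  text: string;      // what the virtual customer says or does
  choices: Choice[]; // an empty array marks an ending
}

// After one or two choices, both branches converge on the same 'choke point'
// node. That keeps the tree manageable, but it also erases the lingering
// effect of the learner's earlier decision.
const nodes: Record<string, StoryNode> = {
  start: {
    id: "start",
    text: "The customer slams their card on the counter.",
    choices: [
      { label: "Apologise and ask what happened", next: "empathise" },
      { label: "Ask for their account number", next: "procedure" },
    ],
  },
  empathise: {
    id: "empathise",
    text: "They sigh and start to explain.",
    choices: [{ label: "Continue", next: "chokePoint" }],
  },
  procedure: {
    id: "procedure",
    text: "They grudgingly read out the number.",
    choices: [{ label: "Continue", next: "chokePoint" }],
  },
  chokePoint: {
    id: "chokePoint",
    text: "Either way, the conversation now rejoins the same central path.",
    choices: [],
  },
};
```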

But in real life, the effects of our choices ripple out, closing off some doors and opening others, affecting the way other human beings choose to interact with us in return. For example, if you picked ‘choice a’, screen 3 should be fundamentally different from what it would be if you picked ‘choice b’.

Choke points reduce the ‘customer’ to a plot device, by robbing the player’s bad choices of their lingering effects. We needed the virtual customer’s opinion of the player to persist, accumulating good will or anger depending on the choices.

On the other hand, accounting for every possible scenario would be inefficient and frustrating to build.

An AI or open world scenario wouldn’t be fit for our purpose either. We wanted to keep the number of choices contained for the same reason that we wanted to keep them open: all choices needed to be impactful, even the bad ones.

We used a fake AI to get our results faster

Meet Bob

Bob was a fake AI who could hold an organic conversation, but couldn’t actually drive it.

The important thing about Bob was that he had to exhibit a range of emotional states. He would need to respond convincingly to the learner’s actions, demonstrating his emotional state through visual and verbal cues. 

The learner would need to read these cues to understand whether their choices were escalating or de-escalating the confrontation, which meant context was as important as how he felt ‘right now’.

In other words, Bob’s level of aggression would persist through the entire simulation. He would be directly impacted by the learner’s choices and the order in which they made them. 

Every choice the learner made would carry weight.

Mistakes would be more than points on a screen; they would have lasting effects that carried through the conversation.

Bob had to respond to questions on a number of topics, including himself, his account and his personal situation where it impacted on his problem with the institution. Those responses had to be modulated by his emotional state and the questions that had come before.

How we built an emotional chatbot

We knew that the learner’s success would depend on two factors: whether or not they could resolve Bob’s issues, and whether or not they had empathised with and calmed him down in the process.

To achieve this, we created an ‘aggression counter’ to track how angry (or calm) Bob got, on a scale of 1 to 10. 

We also built a learning string, which would be the learner’s (hidden) score. The client provided us with a list of positive and negative strategies for de-escalating conflict, so we assigned each one a letter.

The learner’s achievements, missed opportunities and poor choices would be presented in a granular feedback screen as an itemised list, and the string would help us keep track of them.
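As a rough illustration (the letters and feedback messages below are invented; the real list of strategies came from the client), the feedback screen could simply decode the string at the end of the simulation:

```typescript
// A hedged sketch of decoding the hidden learning string into the itemised
// feedback screen. Letters and messages are invented for illustration.

const feedbackCatalogue: Record<string, string> = {
  A: "You acknowledged the customer's frustration early.",
  B: "You explained the next steps clearly.",
  C: "You quoted policy instead of listening.",
  D: "You dismissed a personal concern.",
};

function buildFeedback(learningString: string): string[] {
  // e.g. "ABC" becomes one feedback line per recorded strategy, in order
  return [...learningString]
    .map((letter) => feedbackCatalogue[letter])
    .filter((line): line is string => Boolean(line));
}
```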

To make the conversation feel organic, our branching narratives were actually made up of free-floating blocks that could be popped into the conversation at almost any time. But unlike a traditional branching narrative, these blocks didn’t link to separate outcomes; they simply affected Bob’s aggression score and the learning string, as well as the language Bob used to phrase his response.
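Here’s a rough sketch of how those blocks and the two trackers could fit together. The names (ConversationBlock, SimState, applyBlock) and values are ours for illustration, not the actual HowToo implementation:

```typescript
// A sketch of the free-floating block mechanic, assuming an aggression
// counter clamped to 1-10 and a hidden learning string of strategy letters.

interface ConversationBlock {
  id: string;
  topic: "account" | "personal" | "smallTalk"; // what the block is about
  aggressionDelta: number; // negative calms Bob down, positive provokes him
  strategy: string;        // letter appended to the hidden learning string
}

interface SimState {
  aggression: number;     // Bob's persistent anger, 1 (calm) to 10 (furious)
  learningString: string; // hidden record of the learner's strategies so far
}

function applyBlock(state: SimState, block: ConversationBlock): SimState {
  return {
    aggression: Math.min(10, Math.max(1, state.aggression + block.aggressionDelta)),
    learningString: state.learningString + block.strategy,
  };
}
```

Because the state persists between blocks, the same block played early or late in the conversation lands on a different aggression level, which is what lets the order of the learner’s choices matter.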

These responses were then scripted and recorded by a voice actor in five different emotional tones. The emotional states were also linked to a second actor performing five different facial expressions, giving Bob the functional sound and appearance of an angry customer.
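One plausible way to wire that up (the tone names, thresholds and file paths are assumptions for the sake of the example, not the production setup):

```typescript
// Bucketing the 1-10 aggression counter into five emotional tones, then
// picking the matching voice recording and facial expression for a response.
// Tone names, thresholds and asset paths are hypothetical.

type EmotionalTone = "calm" | "wary" | "irritated" | "angry" | "furious";

function toneFor(aggression: number): EmotionalTone {
  if (aggression <= 2) return "calm";
  if (aggression <= 4) return "wary";
  if (aggression <= 6) return "irritated";
  if (aggression <= 8) return "angry";
  return "furious";
}

function assetsFor(blockId: string, aggression: number) {
  const tone = toneFor(aggression);
  return {
    audio: `audio/${blockId}_${tone}.mp3`, // one recording per tone per line
    face: `video/bob_${tone}.mp4`,         // one facial performance per tone
  };
}
```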

What gamification taught us

These two sets of variables interacted in surprising ways. Being helpful and practical with Bob would make him calmer, but not as calm as if you were empathetic and could demonstrate you were listening. Similarly, being officious and obstructive would make Bob angry, but being rude or showing a lack of empathy would make him even angrier.
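In block terms, those weightings might look something like this (illustrative values only; the real strategies and their weights came from the client’s de-escalation material):

```typescript
// Sample blocks showing the relative weightings described above:
// empathy calms Bob more than practical help, and rudeness provokes him
// more than mere officiousness. Ids, letters and values are invented.

interface SampleBlock {
  id: string;
  aggressionDelta: number; // negative calms, positive provokes
  strategy: string;        // letter recorded in the hidden learning string
}

const sampleBlocks: SampleBlock[] = [
  { id: "acknowledgeFrustration", aggressionDelta: -2, strategy: "A" }, // empathetic
  { id: "explainNextSteps",       aggressionDelta: -1, strategy: "B" }, // helpful, practical
  { id: "quoteThePolicy",         aggressionDelta: +1, strategy: "C" }, // officious
  { id: "dismissHisComplaint",    aggressionDelta: +2, strategy: "D" }, // rude, unempathetic
];
```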

One thing we hadn’t expected was the way the learner’s experience could be transformed by the context of the choices themselves.

For example, a learner could take a constructive path through the conversation, resolve Bob’s issue and leave him feeling calm and reassured.

Or, the learner could choose those same options in a different order and come across as officious and functional, solving the customer’s financial issue while ignoring their feelings.

Alternatively, the learner could constantly tease Bob about his predicament, then guilt him into calming down, before goading him again. This would usually result in Bob ending the conversation, but could end with him remaining angry despite his problem being resolved.

All this with the same set of 12-15 questions.

Is gamification right for you?

By providing just enough material for the customer to feel organic, we created an effective solution for empathy training.

Bob’s responses would take on new meaning depending on the context and attitude that the learner brought to the situation. They would fill in the gaps of Bob’s behaviour by intuiting how their choices had inspired these changes.

And in recognising their impact on the character, and eventually ‘winning’ the game, the learner would come to a transformative understanding of their own role in an interpersonal conflict.

Interested in adding gamification to your next eLearning project? Reach out to our Xperts today.

Posted Nov 17, 2022 in Learning Design
