I recently designed and ran a new masterclass – Making Automation & Self Service Work In Your Contact Centre. It was commissioned by Ann Marie Stagg and initially offered to her CCMA members. By the end of the day we were all shaken and stirred.
We concluded that Intelligent Assistants are not just a channel shift strategy. At the very least, they are a transformational opportunity to redefine customer engagement. In some cases they might even disrupt how certain services are bought and the strategic alliances that support them.
How so?
As you might recall from my preceding article on this topic, I lined up Virtual Assistants with Visual IVR and Interaction Analytics as the most catalytic technologies that contact centres ought to be adding to their three-year roadmap.
Had you already done this, you’d be the proud owner of an entirely new capability. By today’s standards, it would be a real head-turning differentiator. If achieved before the end of the decade, it will still keep your brand in the digital race.
I promised more use cases and more explanation. So here’s your next opportunity to binge on Intelligent Assistants.
What’s Under The Bonnet?
Let’s start with some further explanation as context setting. When you look at some of the examples provided in this article, you might wonder what all the fuss is about. Surely this is nothing new? You’ve seen them, tried them and maybe been distinctly ‘meh’ about the experience. Well, that’s because you are remembering the early prototypes. Evolution has now made them smart enough to turn ‘meh’ into ‘:-)’.
For instance, I remember that when I first talked about voice recognition a decade ago, it was 60% effective out of the box. The rest was custom tuning for that particular audience’s vocabulary. Umpteen iterations of those language models later, out-of-the-box recognition now exceeds 90%.
Listen to this presentation by John Romano, Director of Performance and Planning, and you will hear how Hyatt Hotels’ virtual assistant is 98% effective at recognising what customers are saying. For comparison, humans perceive ‘the real deal’ at around 95% effectiveness.
If you watch far enough into his presentation, also note how fluent and fast-paced Hyatt’s virtual assistant sounds. It is generations away from the ‘oh so slow’ ping-pong dialogue still found on many IVRs.
A final point, and maybe the most impressive in fact, is that Hyatt can use its Virtual Assistants to start a conversation with customers. The information they gather during their part of an interaction is then dynamically shared with live advisors as they take over to complete the conversation.
Talk time for these advisors has increased from 40% to 50% as a result. The customer feedback says it’s a win-win.
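As a very rough sketch of that handoff pattern, assuming nothing more than a shared context object passed from bot to human, it might look something like the following; the field names and flow are my own illustration, not Hyatt’s actual integration.

```python
# A rough sketch of the handoff described above: the virtual assistant gathers
# details first, then passes everything it has learned to the live advisor.
# Field names and values are illustrative only, not Hyatt's actual integration.

from dataclasses import dataclass, field


@dataclass
class ConversationContext:
    intent: str = ""
    details: dict = field(default_factory=dict)
    transcript: list = field(default_factory=list)


def virtual_assistant_turns(context: ConversationContext) -> None:
    """Collect what the customer wants before any human gets involved."""
    context.intent = "modify_reservation"
    context.details.update({"confirmation_number": "ABC123",
                            "requested_change": "extend stay by one night"})
    context.transcript.append("Customer asked to extend their stay by one night.")


def hand_over_to_advisor(context: ConversationContext) -> None:
    """The advisor starts with everything the assistant already learned."""
    print(f"Intent: {context.intent}")
    print(f"Details gathered so far: {context.details}")
    print("Transcript so far:", *context.transcript, sep="\n  ")


ctx = ConversationContext()
virtual_assistant_turns(ctx)
hand_over_to_advisor(ctx)   # no need to re-ask the customer for any of this
```

The point is simply that the advisor inherits the conversation rather than restarting it, which is where the extra talk time comes from.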
So my first point is that things have moved on. Secondly, this is not about a single technology. It is worth repeating the basic recipe for a 2016 Virtual Assistant. Baked into most solutions will be an ecosystem of the following ingredients.
Natural Language Processing – Machine Learning – Semantic Search – Predictive Analytics
A virtual assistant’s effectiveness comes from a deep, ongoing interplay between these rapidly evolving technologies. Here are a few examples to show what I mean.
Is Voice Becoming A Born Again Interface?
Natural language allows us to use our everyday language to ask for what we want. These days it’s a powerful self-service interface. Sitting on the top floor of Costas in Waterloo station recently, I was treated to an excellent example of this by Tony Ballardie, CEO of Capito Systems, a UK start-up with deep links into the globally recognised centre of excellence for speech recognition at Cambridge University.
It was a small, noisy room. Tony spoke into his smartphone asking for the latest odds on Manchester United winning their next match. Not so good, at least for Manchester United. Very good, though, when compared against Google Now, which was tasked with a similar question and had problems locking onto the request against the ambient background chatter.
Tony’s secret sauce was to send the request to his semantic cloud, which did a far better job of tuning into the right conversation.
‘Semantic search’ is another key ingredient in the Virtual Assistant recipe. It adds the intelligence of context and user intent in order to deliver relevant results. Many of us will have experienced the opposite when searching on an organisation’s support page. If the search capability is only matching on the user’s keywords, it will return every article in which those keywords are mentioned and indexed.
Ever wondered why the ‘system is slow today’ while talking to a contact centre? Maybe it’s the same problem. The advisor is suffering an avalanche of irrelevant answers after questioning the in-house knowledge system.
A Google search, on the other hand, feels so much smarter. This is because Google are masters of semantic search, which aims to understand the context of a user’s question and hence their intent. This can factor in the location of the user, the time of day, what device they’re using and, most importantly, their search history. It can even include the answers that were selected by other users who recently made identical or similar searches.
So that’s what semantic search is about. It aims to serve up just the answer you were looking for. And this is why your organisation needs it. Your customers expect the same rapid and relevant service from you as Google already provides them. If not, then expect them to go elsewhere or escalate to more expensive live channels if the issue really matters to them.
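To make the keyword-versus-intent distinction a little more concrete, here is a minimal sketch; the articles, context signals and scoring weights are entirely hypothetical and far cruder than anything Google or a commercial vendor would actually use.

```python
# A deliberately simple contrast between keyword matching and intent-aware
# ranking. The articles, context signals and weights are hypothetical.

ARTICLES = [
    {"id": 1, "title": "Reset your password", "keywords": {"password", "reset", "login"}},
    {"id": 2, "title": "Password policy for administrators", "keywords": {"password", "policy", "admin"}},
    {"id": 3, "title": "Unlock a locked account", "keywords": {"account", "locked", "unlock", "login"}},
]


def keyword_search(query_terms: set) -> list:
    """Return every article mentioning any query term -- the 'avalanche' problem."""
    return [a for a in ARTICLES if a["keywords"] & query_terms]


def semantic_search(query_terms: set, user_context: dict) -> list:
    """Rank articles by term overlap plus simple context signals (device, history)."""
    def score(article):
        s = len(article["keywords"] & query_terms)
        # Boost answers that this user, or users like them, found helpful before.
        if article["id"] in user_context.get("previously_helpful", set()):
            s += 2
        # Example context rule: mobile users asking about login usually want a reset.
        if user_context.get("device") == "mobile" and "reset" in article["keywords"]:
            s += 1
        return s
    return [a for a in sorted(ARTICLES, key=score, reverse=True) if score(a) > 0]


query = {"password", "login"}
context = {"device": "mobile", "previously_helpful": {1}}
print("Keyword hits:  ", [a["title"] for a in keyword_search(query)])
print("Intent ranking:", [a["title"] for a in semantic_search(query, context)])
```

The keyword version returns everything it can match; the intent-aware version puts the answer the user almost certainly wants at the top.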
This expectation for speed and convenience takes me back to the betting example Tony showed me. It was lightning quick as it trawled the betting site’s live database of everything you could possibly bet on.
Later on I had a real ‘I want one’ moment as we left Costas and Tony found both his train home and platform number with a single voice command. I use my train app all the time, but voice is so much easier than text-based requests!
However, please note that this is not a one-size-fits-all point I’m making here. Omni-channel service design, in my book at least, is all about understanding customer situations and the channel choices they need. Betting and trains both operate as real-time services, so voice aces text in terms of speed.
On the other hand you might not want others to know your business when searching for underwear on a retail site. Text is the more discreet option. It all depends.
But beyond this important service design principle, it’s worth noting that voice is being extensively punted as a post-keyboard interface. You can now talk to cars and TVs, and the list is growing fast.
An often-quoted figure from a 2014 Google research paper revealed that 41% of adults and 55% of teens use voice search more than once a day.
The most popular topics on the wish list were to ‘find my keys/the remote’ and ‘send me pizza’. This last customer need may have been the motivation behind a well-known use case from Domino’s Pizza. When I came across it a few years ago as an example of Intelligent Assistants, I initially found it counterintuitive. Domino’s had already achieved $4bn in global sales via digital channels. So why revert to voice when the website and mobile app were performing so well?
In the version of the use case I have, Domino’s explains why it introduced ‘Dom’, its virtual voice assistant: pizza is a ‘complicated, simple’ product, and voice helps customers when ordering online.
Subsequent customer feedback validated the hunch that voice was faster, both in perceived and actual time savings. It was also seen as a simpler interface with fewer steps than touch. Finally, it was seen as being easier since the ordering process felt more natural and intuitive to consumers. So Dom became an embedded part of the online order process.
Getting Spooky Or Just Incredibly Smart?
What else is lurking in the Virtual Assistant soup?
Machine learning is a type of artificial intelligence (AI) that provides computers with the ability to learn without being explicitly programmed. Amelia is an example of this type of capability: fifteen years in the making, yet only introduced to the world at large in 2015. Chetan Dube, founder and CEO of New York City-based IPsoft, explains what they have built.
“Amelia learns with every transaction and builds a mind map on the fly. As more incidents come in, this mind map is rapidly building, just the way humans build their mind maps. Soon it represents the cumulative intellect of all the different [employees] who have been fielding these different calls.”
And she learns pretty quickly. In beta trials with one enterprise partner, she handled fewer than 10% of all incoming calls during the first week. By the end of the first month, that number had jumped to 42%.
By the close of the second month, Amelia was successfully handling 61% of incoming requests.
Her rapid progress is based on understanding how her human counterparts answer her escalated questions and then storing those answers ready for use next time.
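In outline, that ‘escalate and learn’ loop is simple enough to sketch; the class and behaviour below are a toy illustration of the pattern, not IPsoft’s actual implementation.

```python
# A toy version of the 'escalate and learn' loop described above.
# Class name, flow and data are my own illustration, not IPsoft's design.

class EscalateAndLearnAssistant:
    def __init__(self):
        self.knowledge = {}   # question -> answer the assistant has learned so far

    def handle(self, question, live_advisor):
        """Answer directly if the question is known; otherwise escalate and remember."""
        if question in self.knowledge:
            return f"[assistant] {self.knowledge[question]}"
        # Unknown question: hand it to a human, then store their answer for next time.
        answer = live_advisor(question)
        self.knowledge[question] = answer
        return f"[escalated] {answer}"


def human_advisor(question):
    return f"Here is how we deal with: {question}"


assistant = EscalateAndLearnAssistant()
print(assistant.handle("How do I change my booking?", human_advisor))  # escalated, then learned
print(assistant.handle("How do I change my booking?", human_advisor))  # now handled by the assistant
```

The real systems obviously generalise across similar questions rather than matching them word for word, but the containment curve quoted above comes from exactly this kind of loop.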
Of course she is not alone. IBM’s ubiquitous Watson is also getting in on the customer service act and fits into the value proposition you are hopefully becoming familiar with.
“Engagement Advisor learns with every human interaction and grows its collection of knowledge.”
As I said earlier, all these core technologies operate as a self-service ecosystem, tightly interacting with each other. They are not yet fully conversational. But they can understand, provide relevant answers and either learn from those interactions or surface knowledge gaps that need plugging.
Sometimes they are voice, sometimes a text-based search box, and sometimes an avatar that even seeks to make an emotional connection, as claimed by Amelia’s inventors.
The buzz towards the end of 2015 was that AI as an umbrella term for many of these technologies is about to go where none of us have gone before. It’s fair to say that opinion seems divided between being nervous and excited at that prospect.
Next Time
In my next post, I’ll show you the level of first-time resolution that is possible with these systems and how certain customer journeys can be entirely entrusted to an Intelligent Assistant. Never forgetting, of course, the design mandate of always providing an obvious escalation path to live assistance!
It might also interest you to know that Intelligent Assistants have made successful inroads in reducing live chat volumes. This might offer a phase two for many organisations that have made chat a centrepiece of their digital engagement strategy. So keep an eye out for those use cases.
Beyond that it seems we can also expect these digital assistants to become even more proactive in making our lives easier. I’ve an interesting use case in saving energy costs for consumers that nicely illustrates this capability.
Doesn’t it make you wonder what the balance between live assistance and self-service will look like one, three and five years from now? Right now I’m prepared to bet that emotional or complex topics remain exclusively in the care of human assistants. But the rest is all opportunity for Intelligent Assistants.
BTW
What I’m attempting to talk about in this series of posts is definitely flavour of the month. Many new vendors have appeared as a result, and they keep turning up. Importantly, they are not the usual names seen at conferences or indeed within your contact centre(s).
No doubt some will be acquired as the incumbents bid to remain front of mind and relevant. But there is a genuine need to get educated.
As you have probably gathered, I’m a growing fan of what Intelligent Assistants can achieve within a well-conceived omni-channel strategy. So much so that I’m running a masterclass on the topic, which is a great way to get up to speed.
Maybe see you at the next one.
Richard Snow
Interesting stuff. I am also coming around to believing that virtual assistants in many shapes and forms are beginning to transform customer engagement. The trick for me, as well as getting the voice recognition right, is “programming” these things to do what customers want; something agents are often not allowed to do. The latest and greatest I have seen is voice-activated interactive videos, which in my mind take virtual assistants to a new level.
Martin Hill-Wilson
Yes, I think they are getting there. I would love to know the specifics of the example you mention. I think their behaviour should be as simple as ‘find and answer’ or ‘escalate and learn’. I see no reason to push a customer out of the system as happens in some IVR workflows.
The folk developing Amelia talk about basic empathy capability in relation to where a customer is on certain journeys. I think that is more aspirational right now, but represents another step forward.
I think the big deal remains the lack of conversational fluency. It’s still one idea at a time, albeit expressed in ‘natural’ language. But all that said, I think the best in class are good enough right now to be used for larger chunks of service requests that were previously delivered live.
Wally Brill
Martin,
Great post. Of course the common call center tasks can be handled by Intelligent Assistants. But the real benefit is when the IA actually performs as your agent in the real world. And I mean as YOUR agent. Booking flights, finding the best energy deal and switching you to it, finding products you’re looking for at the best price and purchasing them with your permission, helping you maintain a healthy lifestyle, etc., etc., etc.
The analogue has to be the real live, human personal assistant who knows your preferences and can anticipate your needs 24/7. Ideally, Tony’s Intelligent Assistant would have proactively provided him with the train information because it knew he was in London, had a ticket or often took that train.
My concern is that once again we’re building service silos without connecting the enterprises, goods and services that provide the customer with the end-to-end outcome they’re looking for. I don’t wake up and think “Gee, I want an air ticket. I think I’ll call the airline.” No, my goal is to re-enact the beach scene in From Here To Eternity with my wife in Hawaii and then have a frosty margarita served to me (with a little umbrella in it).
A long-winded way of saying that my intelligent assistant will connect all the dots, from buying the tickets to ordering the car to take us to the airport, to getting us the kind of hotel room we like, with minimum intervention from me.
Have a look at Frank: http://www.mywave.me and let’s get together at the IAC in London. It’ll be great to finally meet you in person so we can have a drink and talk about data privacy!
Martin Hill-Wilson
Wally,
What can I say? Many thanks for the comments. Could not agree with you more, and you must have prophesied part three in this series of posts, a series in part designed to drum up interest in the conference. My voice-activated, dictation-touting digital editor was already scheduled to cover the final phase of IA maturity next week, with tales of Frank and full-blooded AI pixie dust.
You can be comforted that Frank, privacy by design and the whole nine yards already take centre stage in the associated masterclass I run, which went down like a tequila slammer last week.
Looking forward to becoming acquainted at the conference.
Martin