Paul Sweeney, Co-Founder / Chief Strategy Officer
What’s changed and how it matters
From Communications to Conversations
"Everybody has a plan until they get punched in the face." (Mike Tyson)
Conversational AI is directionally correct…
LLMs are great disambiguators and solve the discovery problem.
Moving from connecting channels to connecting to what happens in channels.
Access to broad and narrow third-party services via plug-ins will come in layers: from selected, to approved, to called up, to created on the fly.
The search query becomes the prompt chain.
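To make the idea concrete, here is a minimal, hypothetical sketch of a query turning into a two-step prompt chain. The `call_llm` function is a stub standing in for any completion API (the function name, prompts, and canned replies are all illustrative, not a real service), so the flow is runnable as-is.

```python
# Hypothetical sketch: a raw search query expanded into a prompt chain.
# call_llm is a stub for any LLM completion API; replace it with a real call.

def call_llm(prompt: str) -> str:
    """Stub LLM: returns a canned answer per chain step."""
    if prompt.startswith("Rewrite"):
        return "billing dispute for invoice 1234"
    if prompt.startswith("Answer"):
        return "Refund issued per policy section 3."
    return ""

def prompt_chain(query: str) -> str:
    # Step 1: disambiguate the raw query into a precise intent.
    intent = call_llm(f"Rewrite this as a precise support intent: {query}")
    # Step 2: answer using the disambiguated intent as context.
    return call_llm(f"Answer the customer, given intent '{intent}'.")

print(prompt_chain("why was i charged??"))  # -> Refund issued per policy section 3.
```

The point is the shape, not the stub: each step's output becomes the next step's prompt, which is what turns a one-shot search query into a chain.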
Text-first experience today, but multimodal next.
The intelligent assistant becomes the new UX
Well, yes. Kinda.
Keeping Conversations on the Rails…
[Diagram: LLM, NLU, and DM (dialogue manager) working together]
Custom LLMs in Context Get Complicated
The Language Model Stack
[Diagram: LLM, DM, channels, knowledge bases, NLU, API, UI, storage, vector DB, models, model management, microservices, MLOps, LLMOps, creator tools, admin, security, audit, reports]
David R. Oliver, https://medium.com/future-architecture/the-language-model-stack-d38c8de880ec
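The LLM/NLU/DM split above can be sketched in a few lines. This is a hypothetical toy, not a real framework: the dialogue manager keeps the conversation on rails by answering known intents deterministically and only falling back to the (stubbed) LLM for open-ended turns. All function names, keywords, and responses are illustrative assumptions.

```python
# Hypothetical sketch of the LLM / NLU / DM split: the dialogue manager (DM)
# handles known intents with fixed responses and uses the LLM only as fallback.
from typing import Optional

def nlu(utterance: str) -> Optional[str]:
    """Toy intent classifier: keyword match standing in for a trained NLU model."""
    keywords = {"balance": "check_balance", "payment": "make_payment"}
    for word, intent in keywords.items():
        if word in utterance.lower():
            return intent
    return None  # no confident intent -> hand off to the LLM

def llm_fallback(utterance: str) -> str:
    """Stand-in for a guarded LLM call, used only when NLU finds no intent."""
    return f"(LLM) Let me look into: {utterance}"

# The "rails": vetted responses the DM is allowed to give for known intents.
RAILS = {
    "check_balance": "Your balance is available in the app under Account.",
    "make_payment": "I can set up a payment plan. What amount works for you?",
}

def dialogue_manager(utterance: str) -> str:
    intent = nlu(utterance)
    return RAILS[intent] if intent else llm_fallback(utterance)
```

The design choice this illustrates: the DM, not the LLM, owns the conversation state and decides when the model gets to speak, which is what keeps custom LLMs controllable in context.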
What We Learned From Above…
- Information is matched, interactions enabled, and transactions made, all orchestrated in one conversation flow. This is how value gets released, but you need it all.
- The LLMs will be open source, customized, and trained. That takes time and effort. Not one LLM but multiple LMs being trained, and that requires LLMOps.
- Model accuracy, intent fit, and outcomes all have to be assured and controlled for: custom data, values, numbers, etc.
- All of this has to be performant, low latency, and very low cost. Scale a suboptimal architecture and see what happens to your AWS bill.
Conversational AI Today…
- Most companies don't have access to the APIs to bring in the data.
- Evolving use of chain of thought, chain of verification, chain of… prompt engineering. Results still vary, and future versions of the LLM (GPT-6) may perform these steps themselves.
- Co-pilots are the UI, but the experiences still must be designed. Chat is the customer co-pilot; agent assist is the employee co-pilot. Still pretty generic, yet work is deeply contextual.
- Copy generation, smart replies, summarization, and conversation labelling for intent-based routing are features here today, but not yet "transformative".
- ChatGPT Enterprise versus custom LLM models is TBD. Enterprises are very concerned with the integrity and value of their own data.
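The chain-of-verification pattern mentioned above can be sketched as a four-step loop: draft an answer, generate check questions, answer each check, then revise. The `llm` function below is a canned stub (all prompts and replies are hypothetical) so the flow runs without a model; in practice each call hits an API and, as noted, results still vary.

```python
# Hypothetical sketch of chain-of-verification prompting.
# llm() is a stub returning canned replies keyed on the prompt prefix.

def llm(prompt: str) -> str:
    canned = {
        "draft": "Paris is the capital of France.",
        "checks": "Is Paris in France?",
        "verify": "Yes.",
        "revise": "Paris is the capital of France.",
    }
    for key, reply in canned.items():
        if prompt.startswith(key):
            return reply
    return ""

def chain_of_verification(question: str) -> str:
    draft = llm(f"draft: {question}")                  # 1. initial answer
    checks = llm(f"checks: {draft}").splitlines()      # 2. verification questions
    evidence = [llm(f"verify: {c}") for c in checks]   # 3. answer each check
    return llm(f"revise: {draft} given {evidence}")    # 4. corrected final answer
```

Each extra step trades latency and cost for reliability, which is exactly the tension the bullets above describe.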
Conversational AI Further Questions…
- Voice interaction is still input- and instruction-based, if that. Keep an eye on voice interaction in the ChatGPT mobile app; behavior will follow engagement.
- IVR versus a voice assistant on a website? Sounds great, but what's the use case? No evidence it is driving down inbound calls, for instance.
- There is 10x the conversational data in voice: performance data for one, then more biometric-type data. Yet there are no permissions or ethical AI rules.
- Voice cloning, voice ID, avatar experiences? Sure, for entertainment. They might be more of a fraud and security issue today.
- Conversational AI and the metaverse would be synergistic, but nobody gets customer service in the metaverse, IMHO.
CPaaS Provocations
- Personal assistant co-pilot as the future customer interface.
- Communications data and knowledge base data are both required to facilitate deeply personalized customer conversations.
- Deep data-driven insights into "everything" that could matter.
- Need for customer intimacy to deliver all of the above.
- Technology stack performance at scale for real-time processing.
- Verticals have their own workflows, data, integrations, and models.
- Where do you partner, with whom, and for what reason?
Insider news…
SuperPrompt
TAKEAWAYS
1. Customer service getting disrupted
2. Conversations contain events and data
3. Internal enterprise silos remain
paul.sweeney@webio.com
Thank you
Learnings from the past year
1. The rise of digital collections
2. Automation works and it's very powerful
3. Internal enterprise silos remain
Insider news…
Nobody gets automated out of their job. Unaddressed work pours in.
Focus on JTBD and see how LLMs etc. can be used.
The LLMs are getting charged for storage. There is a lesson in that.
LLM-driven services need zero latency. There is a lesson in that.
Voice LLMs are hard. They won't always be. Then they will be very profitable.
Automate entire processes to deliver a solid value proposition.
Takeaways
- Customer service getting disrupted
- Conversations contain events and data
- Co-pilots for everyone, even you