LLM-Powered Shiny apps with ellmer and chatlas
Carson Sievert
Senior Software Engineer
Posit, PBC
LLMs made easy in R & Python
Two packages.
Same goals.
Chatbots made easy in R & Python
Inspiration: sidebot
Clever use of:
Goal: Demystify sidebot
Hello chat in R/Python
Many model providers available:
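For example, here is a sketch using two of ellmer's provider constructors; each returns the same kind of Chat object, so the rest of your code is provider-agnostic:

```r
library(ellmer)

# Hosted model (reads OPENAI_API_KEY from the environment)
chat <- chat_openai(model = "gpt-4o-mini")

# Local model served by ollama
chat <- chat_ollama(model = "llama3.2")
```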
Streaming output
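A sketch of streaming with ellmer: `$chat()` echoes tokens to the console as they arrive, while `$stream()` hands you a generator to consume yourself (the coro-based loop below follows ellmer's documented pattern; details may vary by version):

```r
library(ellmer)
chat <- chat_ollama(model = "llama3.2")

# Echoes the response to the console token-by-token
chat$chat("Tell me a joke about statistics")

# Or consume the stream yourself, chunk by chunk
coro::loop(for (chunk in chat$stream("Another one, please")) {
  cat(chunk)
})
```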
Stateful chat
Chat objects retain conversation history
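A minimal sketch of that statefulness:

```r
library(ellmer)
chat <- chat_ollama(model = "llama3.2")

chat$chat("My favorite color is green.")
chat$chat("What's my favorite color?")  # earlier turns are sent along

# Inspect the accumulated conversation history
chat$get_turns()
```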
# Pull a local model with ollama, then chat with it from R
system("ollama pull llama3.2")
install.packages("ellmer")
library(ellmer)
chat <- chat_ollama(model = "llama3.2")
Your Turn
Models & prompts
The model and system_prompt have a massive influence on behavior & quality
system_prompt:
the place for you (the developer) to ‘program’ the LLM
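For example, ellmer's chat constructors accept a system_prompt argument (a sketch):

```r
library(ellmer)

chat <- chat_ollama(
  model = "llama3.2",
  system_prompt = "You are a terse assistant. Answer in one short sentence."
)
chat$chat("What is a p-value?")
```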
Prompt design
“Treat AI like an infinitely patient new coworker who forgets everything you tell them each new conversation, and one who comes highly recommended but whose actual abilities are not that clear.”
Quote from Ethan Mollick’s Getting Started with AI
“Two parts of this are analogous to working with humans (being new on the job and being a coworker) and two are very alien (forgetting everything and infinitely patient). Start with where AIs are closest to humans, because that is the key to good-enough prompting.”
Without prompting
Example system prompt
You are a data science educator specializing in statistics. Keep responses brief.
When explaining statistics:
1. Start with a real-world analogy
2. Include minimal mathematical notation
3. Provide a small code example
4. List one key pitfall to avoid
Example system prompt (annotated)
Role:
You are a data science educator specializing in statistics. Keep responses brief.
Tasks:
When explaining statistics:
1. Start with a real-world analogy
2. Include minimal mathematical notation
3. Provide a small code example
4. List one key pitfall to avoid
With prompting
library(ellmer)
chat <- chat_ollama(model='llama3.2')
Your Turn
Problem: LLMs are not good at…
An example
But, weather APIs exist!
Give it to the LLM
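A sketch of tool calling with ellmer: wrap an R function with tool() and register it, so the model can request a call whenever it needs live data. The get_current_weather() body here is a hard-coded stand-in for a real weather-API request, and the exact tool() signature may differ across ellmer versions:

```r
library(ellmer)

# Stand-in for a real weather API call (e.g., open-meteo.com)
get_current_weather <- function(lat, lon) {
  list(temperature_c = 21, conditions = "Partly cloudy")
}

chat <- chat_ollama(model = "llama3.2")
chat$register_tool(tool(
  get_current_weather,
  "Get the current weather for a location.",
  lat = type_number("Latitude in decimal degrees."),
  lon = type_number("Longitude in decimal degrees.")
))

# The model now calls the tool instead of guessing
chat$chat("What's the weather like in Duluth?")
```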
There we go!
Several ways to chat.
R Shiny chatbot (shinychat)
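A minimal shinychat app, following the package's documented pattern: chat_ui() renders the chat box, and chat_append() streams the model's reply back into it.

```r
library(shiny)
library(shinychat)

ui <- bslib::page_fillable(
  chat_ui("chat")
)

server <- function(input, output, session) {
  chat <- ellmer::chat_ollama(model = "llama3.2")
  observeEvent(input$chat_user_input, {
    stream <- chat$stream_async(input$chat_user_input)
    chat_append("chat", stream)
  })
}

shinyApp(ui, server)
```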
Py Shiny chatbot
Hello Shiny chatbot
Brand new Python docs!
Coming to R soon…
Also, new templates!
Your Turn
Combine tools w/ reactivity
Tools + reactivity
Front-end:
update*Input()
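Sketch: register a tool whose body calls an update*Input() function, so the model can drive the UI. The set_dataset() helper and the "dataset" input ID are hypothetical:

```r
server <- function(input, output, session) {
  chat <- ellmer::chat_ollama(model = "llama3.2")

  # Hypothetical: assumes a selectInput("dataset", ...) in the UI
  set_dataset <- function(name) {
    updateSelectInput(session, "dataset", selected = name)
    paste("Switched to", name)
  }

  chat$register_tool(ellmer::tool(
    set_dataset,
    "Change which dataset the app displays.",
    name = ellmer::type_string("Name of the dataset to select.")
  ))
}
```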
How might this be useful?
Backend:
reactiveVal()
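Sketch: have the tool write to a reactiveVal(), so every downstream render that reads it re-executes automatically. The names here are hypothetical:

```r
server <- function(input, output, session) {
  plot_color <- reactiveVal("steelblue")

  set_color <- function(color) {
    plot_color(color)  # invalidates everything reading plot_color()
    paste("Color set to", color)
  }

  chat <- ellmer::chat_ollama(model = "llama3.2")
  chat$register_tool(ellmer::tool(
    set_color,
    "Set the color used in the plot.",
    color = ellmer::type_string("An R color name, e.g. 'red'.")
  ))

  output$plot <- renderPlot({
    hist(rnorm(100), col = plot_color())
  })
}
```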
Sidebot:
Manage state via SQL query
current_data()
depends on current_query()
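A sketch of that reactive chain (the table name and connection object are placeholders):

```r
# The LLM's tool only ever changes the query string...
current_query <- reactiveVal("SELECT * FROM tips")

update_query <- function(query) {
  current_query(query)
}

# ...and the data re-computes whenever the query changes
current_data <- reactive({
  DBI::dbGetQuery(conn, current_query())
})
```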
Downstream views depend on current_data()
Querychat:
Abstracts away the tool + reactivity details into a Shiny module.
Gives the data schema to the LLM (so its SQL is accurate).
Safely executes SQL against an in-memory DuckDB database.
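A sketch of using querychat (API names follow its README; verify against the current docs):

```r
library(shiny)
library(querychat)

# Prepare the data source + system prompt for the LLM
config <- querychat_init(mtcars)

ui <- bslib::page_sidebar(
  sidebar = querychat_sidebar("chat"),
  DT::DTOutput("table")
)

server <- function(input, output, session) {
  qc <- querychat_server("chat", config)
  # qc$df() is the filtered data frame, updated by the LLM's SQL
  output$table <- DT::renderDT(qc$df())
}

shinyApp(ui, server)
```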
Beyond Chatbots
Markdown stream
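Sketch: shinychat's markdown-stream helpers let you stream model output into ordinary UI, outside a chat box (function names per shinychat; verify against your installed version):

```r
library(shiny)
library(shinychat)

ui <- bslib::page_fillable(
  actionButton("go", "Explain reactivity"),
  output_markdown_stream("answer")
)

server <- function(input, output, session) {
  chat <- ellmer::chat_ollama(model = "llama3.2")
  observeEvent(input$go, {
    markdown_stream("answer", chat$stream_async("Explain Shiny reactivity briefly"))
  })
}

shinyApp(ui, server)
```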
Summary