1 of 57

LLM-Powered Shiny apps with ellmer and chatlas

Carson Sievert

Senior Software Engineer

Posit, PBC

2 of 57

LLMs made easy in R & Python

3 of 57

Two packages.

Same goals.

  • One interface to many LLMs
  • Streaming output
  • Tool/function calling
  • Structured data extraction
  • Async/non-blocking
  • And more…

4 of 57

Chatbots made easy in R & Python

5 of 57

Inspiration: sidebot

6 of 57

Inspiration: sidebot

7 of 57

Inspiration: sidebot

8 of 57

Clever use of:

  • System prompt design
  • Tool calling + reactivity

Goal: Demystify sidebot

9 of 57

Hello chat in R/Python

10 of 57

Hello chat in R/Python

Many model providers available:

  • ChatAnthropic()
  • ChatGoogle()
  • ChatOllama()
  • And more…

11 of 57

Streaming output

12 of 57

Stateful chat

Chat objects retain conversation history
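Under the hood, "stateful" just means the chat object keeps a growing list of turns and resends them with every request. A stdlib-only sketch of that idea (this is the concept behind ellmer/chatlas Chat objects, not their actual implementation):

```python
# Illustrative sketch of a stateful chat object: each call appends to a
# turn list, so later requests "remember" earlier ones. Not the real
# ellmer/chatlas internals -- just the idea.
class ToyChat:
    def __init__(self, system_prompt=None):
        self.turns: list[dict] = []
        if system_prompt:
            self.turns.append({"role": "system", "content": system_prompt})

    def chat(self, user_input: str) -> str:
        self.turns.append({"role": "user", "content": user_input})
        # A real Chat object would send self.turns to the model provider here.
        reply = f"echo: {user_input}"
        self.turns.append({"role": "assistant", "content": reply})
        return reply

chat = ToyChat(system_prompt="Be brief.")
chat.chat("My name is Carson.")
chat.chat("What is my name?")  # the model would see the earlier turns, too
```

Because the full turn list travels with each request, the second question can be answered from the first turn's content.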

13 of 57

Stateful chat

14 of 57

Your Turn

  • Download and install Ollama from ollama.com
  • Open R & run:

system('ollama pull llama3.2')
install.packages('ellmer')
library(ellmer)
chat <- chat_ollama(model='llama3.2')

15 of 57

Models & prompts

model and system_prompt have a massive influence on behavior & quality

16 of 57

Models & prompts

17 of 57

Models & prompts

system_prompt:

the place for you (the developer) to ‘program’ the LLM

18 of 57

Prompt design

Treat AI like an infinitely patient new coworker who forgets everything you tell them each new conversation, and one who comes highly recommended but whose actual abilities are not that clear.

Quote from Ethan Mollick’s Getting Started with AI

19 of 57

Prompt design

Quote from Ethan Mollick’s Getting Started with AI

Two parts of this are analogous to working with humans (being new on the job and being a coworker) and two are very alien (forgetting everything and infinitely patient). Start with where AIs are closest to humans, because that is the key to good-enough prompting.

20 of 57

Without prompting

21 of 57

Example system prompt

You are a data science educator specializing in statistics. Keep responses brief.

When explaining statistics:

1. Start with a real-world analogy

2. Include minimal mathematical notation

3. Provide a small code example

4. List one key pitfall to avoid
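The prompt above, as you might wire it up in code. Both ellmer's chat functions and chatlas's Chat classes accept a system_prompt argument; the Ollama usage shown in the comment assumes a local Ollama server with llama3.2 pulled:

```python
# The example system prompt as a plain string, ready to pass via the
# `system_prompt` argument that chatlas and ellmer both accept.
SYSTEM_PROMPT = """\
You are a data science educator specializing in statistics. Keep responses brief.

When explaining statistics:
1. Start with a real-world analogy
2. Include minimal mathematical notation
3. Provide a small code example
4. List one key pitfall to avoid
"""

# e.g., with chatlas (assumes Ollama is running locally):
# from chatlas import ChatOllama
# chat = ChatOllama(model="llama3.2", system_prompt=SYSTEM_PROMPT)
```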

22 of 57

Example system prompt

Role

You are a data science educator specializing in statistics. Keep responses brief.

Tasks

When explaining statistics:

1. Start with a real-world analogy

2. Include minimal mathematical notation

3. Provide a small code example

4. List one key pitfall to avoid

23 of 57

With prompting

24 of 57

Your Turn

  • Open R and run:

library(ellmer)
chat <- chat_ollama(model='llama3.2')

  • Add a system_prompt and start chatting
25 of 57

Problem: LLMs are not good at…

  • Precise or reliable calculations with data.
  • Accessing up-to-date, private, or otherwise “most relevant” information.
  • Generally accomplishing programmatic tasks at the request of the user.

26 of 57

An example

27 of 57

But, weather APIs exist!

28 of 57

Give it to the LLM
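In chatlas, a tool is just a typed, documented function you register on the chat; the type hints and docstring become the schema the model sees. A self-contained sketch of a weather tool (the actual API call is replaced by a placeholder body, clearly marked, so nothing here needs a network):

```python
def get_current_temperature(latitude: float, longitude: float) -> dict:
    """Get the current temperature (in °C) at a location.

    The type hints and this docstring are what the LLM uses to decide
    when and how to call the tool.
    """
    # A real tool would call a weather API here; this placeholder body
    # keeps the sketch self-contained and runnable offline.
    return {"latitude": latitude, "longitude": longitude, "unit": "celsius"}

# Handing it to the LLM (chatlas):
# chat.register_tool(get_current_temperature)
# ellmer has an analogous register_tool()/tool() mechanism on the R side.
```

When the user asks about the weather, the model requests a call to this function with arguments it fills in, the package runs it, and the result is fed back into the conversation.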

29 of 57

There we go!

30 of 57

Several ways to chat.

  • Interactive use
    • The chat() method
    • live_console()
    • live_browser()

  • Programmatic use
    • The stream() method
    • The stream_async() method

31 of 57

R Shiny chatbot (shinychat)

32 of 57

Py Shiny chatbot

33 of 57

Hello Shiny chatbot

34 of 57

Brand new Python docs!

35 of 57

Coming to R soon…

36 of 57

Also, new templates!

37 of 57

Your Turn

  • Go to the 'Generative AI' section of shiny.posit.co/py/templates
  • Pick a template that interests you and play with it.

38 of 57

Combine tools w/ reactivity

39 of 57

Tools + reactivity

  • Let the LLM 'control' the app w/ tools that change reactive state.

  • Two categories of state:
    • Frontend
    • Backend

40 of 57

Front-end:

update*Input()

41 of 57

42 of 57

How might this be useful?

  • Have a complex UI (lots of tabs, controls, etc.)?
    • Let users describe what they want instead of hunting for it

  • Combine with input suggestions to:
    • Quickly explore different input values

43 of 57

Backend:

reactiveVal()

44 of 57

Sidebot:

Manage state via SQL query

45 of 57

current_data() depends on current_query()

46 of 57

Downstream views depend on current_data()

47 of 57

Querychat

Abstracts away the tool + reactivity details into a module

48 of 57

Querychat

Gives the data schema to the LLM

(so generated SQL is accurate)

49 of 57

Querychat

Safely executes SQL against an in-memory DuckDB database
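The safety point: the LLM never touches the app's R/Python state directly; it only produces a SQL string, which is executed against an in-memory database. querychat uses DuckDB; stdlib sqlite3 stands in here so the sketch has no dependencies:

```python
import sqlite3

# querychat executes LLM-generated SQL against an in-memory DuckDB
# database; sqlite3 stands in here to keep the sketch dependency-free.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tips (total_bill REAL, tip REAL, day TEXT)")
con.executemany("INSERT INTO tips VALUES (?, ?, ?)",
                [(16.99, 1.01, "Sun"), (10.34, 1.66, "Mon")])

llm_sql = "SELECT * FROM tips WHERE day = 'Sun'"  # as produced by the LLM
rows = con.execute(llm_sql).fetchall()  # only the query string crosses over
```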

50 of 57

51 of 57

52 of 57

53 of 57

Beyond Chatbots

54 of 57

Markdown stream

55 of 57

Markdown stream

56 of 57

Summary

  • ellmer & chatlas make it easy to program with LLMs in R/Python 🚀

  • Shiny makes it easy to wrap a web interface around your LLM programs 🌐

  • Do magical things with prompt design, tool calling, and reactivity ✨

  • Easily let users chat with their data using querychat 💬

57 of 57

Thank you.

Q&A