1 of 21

Confidence Interval

An AI-supported reasoning scaffold

for college students (and others)

Vaughn Tan

Singapore Government Centre for Strategic Futures

Winding Road Pte Ltd

UCL School of Management


2 of 21

An AI-powered tool

(a reasoning scaffold)

that helps users go from a thinly articulated argument

to a robustly conceptualised argument

in 15 minutes.


3 of 21

Reasoning scaffolds help humans reason explicitly and therefore more effectively about subjective things, not just objective truths.

Failures of subjective reasoning are the most frequent cause of coordination problems, yet subjective reasoning remains a theoretically and practically under-explored area in epistemics.

Learn more about reasoning scaffolds here


4 of 21

Teaching in universities since 2008 (undergraduates through PhDs and executive MBAs), and working as a strategy consultant for governments, multilaterals, corporations, and startups since 2012.

tl;dr:

  • The ability to do subjective reasoning is declining.
  • The decline is observable in students and non-students.
  • It leads to long-term coordination problems.


5 of 21

Boeing’s series of 737 MAX disasters shows how coordination breaks down when different parts of an organisation unintentionally pursue different and conflicting subjective definitions of “good performance”: increasing output volume, maintaining compliance, or controlling production costs.


6 of 21

The decline is accelerating with the advent of AI tools.

I’ve been working on the problem for the last 3 years,

and testing mechanisms for enabling better human subjective reasoning since late 2024.


7 of 21

An early version of a reasoning scaffold.


8 of 21

Tested reasoning scaffolds using pen-and-paper mechanisms first, for faster iteration and the ability to modify scaffolds in real time.

Real-world teams, real-world subjective reasoning about strategy for public utilities and economic development.

Very strong initial feedback:

Teams pirated the test materials for their own use, and participants requested more workshops using the test mechanism.


9 of 21

The Future of Life Foundation supported final testing and prototyping of a self-service, AI-supported version designed for college students.


10 of 21


11 of 21

Design principles:

  • Explicitly scaffold reasoning steps.
  • Use AI systems as a Socratic mirror only.
  • Separate meaningmaking from other work.
  • Entrust meaningmaking only to human users.
  • Don’t try to replace human instructors.
  • Instead, free up instructor time for higher-order coaching on subjective reasoning.

Learn more about meaningmaking here


12 of 21

Feedback from student and instructor testers:

  • Massive instructor time savings (~60 minutes per student).
  • Time savings for students (~45 minutes per student).
  • Significant output quality improvements.
  • Applies to many types of subjective arguments.


13 of 21

“The 15 minutes I spent on this was more helpful than an entire year-long module on critical thinking.”

44 testers so far from educational institutions.

Over 95% want their institution to provide access.


14 of 21

Also being tested by government strategy groups, public utility operations teams, development agency leadership teams, consultancies, and startups.


15 of 21

“This is already helpful for clarifying our thinking. If we had a custom version for [our specific context] we would buy access for our middle management layer.”


16 of 21

Across the board, useful for improving arguments about subjective value judgments:

  • Justifying a business model.
  • Deciding on an investment commitment.
  • Choosing between product development options.
  • Arguing for a particular policy direction.


17 of 21

Fully functioning prototype

ci1.vercel.app


18 of 21

Whitelist request form

forms.gle/Q8PYkEVFnoDaoggC6


19 of 21

Read the development note

vaughntan.org/aiux


20 of 21

Get in touch to learn more about the project.

I’d love your help with introductions to:

  • Institutions that want to test.
  • Funders who want to subsidise access.
  • People/teams who want to build it out with me.


21 of 21


web

email

LinkedIn

Farcaster

X

Bluesky

© VAUGHN TAN 2024-2045
