1 of 12

The Limits to Thinking Like a Computer Scientist

Neil Chilson

Acting Chief Technologist

Federal Trade Commission

nchilson@ftc.gov

@TechFTC

@telecomlawyer

2 of 12

Disclaimer

My views are my own and do not necessarily reflect the views of the FTC, its Commissioners and staff, or anyone else for that matter.

3 of 12

Overview

  • Computational Thinking Is Important and Useful
  • Computational Thinking Has Limits, Say Hayek and the Lego Movie
  • Policymakers Should Use Regulatory Humility and Analytical Egalitarianism

4 of 12

Computational Thinking - What Is It?

Computational thinking is the thought processes involved in formulating a problem and expressing its solutions in such a way that a computer—human or machine—can effectively carry out. - Jeannette M. Wing

Often decomposed into three parts

  • Abstraction - my focus
  • Automation
  • Analysis

WARNING - I’m building a bit of a strawman

5 of 12

Abstraction is Key

  • Recognizing patterns in a class of problems and creating a generic solution
    • Recognize what changes - the data
    • Recognize what doesn’t change - the algorithm

Legal analogy: Going from common law to codified law

  • “Black Box Thinking”
    • Big problems can be decomposed into a set of smaller problems
    • Once a smaller problem is abstracted, we treat its solution as a black box

Legal analogy (imperfect): scope of review on appeal

Abstraction is model building
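
A minimal Python sketch of this idea (my own illustration, not from the talk; the function and data names are hypothetical): the generic algorithm stays fixed, the data and the test vary, and once written the function is reused as a black box.

    # Hypothetical illustration: the algorithm is the part that doesn't change.
    def apply_test(records, test):
        """Generic algorithm: iterate, test, collect. This part stays the same."""
        return [r for r in records if test(r)]

    # The data and the test are the parts that change from problem to problem.
    contracts = [
        {"party": "A", "value": 500},
        {"party": "B", "value": 5000},
    ]

    # Once abstracted, apply_test is a black box: we reuse it without reopening it.
    high_value = apply_test(contracts, lambda c: c["value"] > 1000)
    print(high_value)  # [{'party': 'B', 'value': 5000}]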

6 of 12

Computational Thinking is Useful and Valuable

A huge swath of human endeavors benefits from computational thinking

  • Does computational thinking = good thinking?
  • Helps us understand computing - critical as “software eats the world”

Benefits for the legal world

  • Computation is changing how lawyers practice
    • Automating the rote parts of legal practice
  • Computation is changing what lawyers practice
    • As software eats the world, it raises many legal questions

7 of 12

But… Computational Thinking Has Limits

(like all thinking)

8 of 12

Hayek’s Knowledge Problem

F.A. Hayek - The Use of Knowledge in Society (1945)

  • A regulator cannot collect enough information to accurately predict the system’s future state
    • Information is widely distributed and often contradictory
    • Collection is inefficient
    • Much knowledge is tacit - it cannot be explained even by the owner of the knowledge
    • Much knowledge is irreducible
  • Much of complexity theory builds on Hayek’s insights
    • Order can emerge from decentralized decisionmaking (see the toy sketch below)
    • Final results sensitive to initial conditions
  • Thus: decentralized processes can take more information into account -> better solutions

Note: Hayek describes the limits of all thinking, not just computational thinking
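
To make the emergence point concrete, here is a toy Python sketch of my own (not from Hayek or the slides): each agent sees only its two neighbors and follows a purely local rule, yet the group converges on a coherent shared value with no central coordinator, and that value is determined by the dispersed starting information.

    # Toy illustration (mine, not Hayek's): order from purely local decisions.
    import random

    random.seed(1)  # change the seed and the emergent outcome changes with it

    n = 20
    estimates = [random.uniform(0, 100) for _ in range(n)]  # dispersed local knowledge

    for _ in range(500):
        updated = []
        for i, e in enumerate(estimates):
            left, right = estimates[i - 1], estimates[(i + 1) % n]
            updated.append((e + left + right) / 3)  # local rule: no central planner
        estimates = updated

    print(max(estimates) - min(estimates))  # spread is now tiny: a shared value emerged

No agent ever sees the whole picture, yet the outcome reflects everyone’s initial information; a central planner would have had to collect all of it first.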

9 of 12

The Lego Movie

10 of 12

Regulatory Humility

  • Recognize limits of what regulation can accomplish
    • Good intentions aren’t sufficient
  • Focus on actual or likely harms
    • Reduces knowledge problem
  • Use the right tools
    • Prefer those that facilitate emergence

11 of 12

Analytical Egalitarianism

  • When we try to model policy outcomes (a dynamic system), the model of the problem should include the solution - and possibly the modeler!
  • We often model the societal problem, but treat the policy prescription as outside the model.
    • But regulators are humans, too, mostly. They have incentives and flaws. The regulatory “solution” will reflect this.
    • And the system reacts and adapts to policy interventions. That creates problems if we can’t anticipate those adaptations.

12 of 12

Conclusion

  • Computational thinking is important, but in law and policy, it has its limits.
  • We should prefer local decision making over global, and systems that support emergent order to those that attempt to constrain or replace it.