1 of 20

An Information Theoretical Perspective on Design

2 of 20

Introduction

  • Information theory provides a mathematical basis for understanding communication and complexity.
  • Design can be viewed as an information transformation process — transforming requirements into specifications.
  • C.E. Shannon’s 1948 model forms the foundation of modern information theory.

3 of 20

Information Theory and Design

  • Design transforms environmental information into system specifications.
  • Noise and model uncertainty act as sources of distortion in this transformation.
  • Suh’s Axiomatic Design emphasizes minimizing information content for simplicity.

4 of 20

Information and Optimization

  • Design optimization increases information content about the system.
  • Information entropy quantifies uncertainty: H = −∫ p(x) log₂ p(x) dx.
  • Direct search methods like the Complex method gain about 0.15 bits per evaluation.
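
The discrete form of the entropy expression above can be sketched in Python; this is an illustrative helper, not code from the source:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform choice among 8 equally likely design alternatives carries
# 3 bits of uncertainty; a fair binary choice carries exactly 1 bit.
print(entropy_bits([1/8] * 8))    # 3.0
print(entropy_bits([0.5, 0.5]))   # 1.0
```

Each design decision or evaluation that rules out alternatives reduces this quantity, which is what the ≈0.15 bits per evaluation figure measures.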

5 of 20

Performance and Information Content

  • Performance depends on information content (Ix) and system size (s).
  • Design range expansion increases the information required for the same tolerance.
  • Adding more parameters expands design space exponentially.
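
The exponential expansion can be made concrete: if each parameter is resolved to b bits, n parameters span 2^(n·b) distinguishable designs. A minimal sketch with hypothetical numbers:

```python
def design_space_size(n_params, bits_per_param):
    """Number of distinguishable designs when each of n_params
    parameters is refined to bits_per_param bits of resolution."""
    return 2 ** (n_params * bits_per_param)

# Adding one parameter at 4-bit resolution multiplies the space by
# 2**4 = 16: growth is exponential in the number of parameters.
print(design_space_size(3, 4))  # 4096
print(design_space_size(4, 4))  # 65536
```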

6 of 20

General Model for Performance

  • Performance can be modeled as a function of design freedom (n) and information (I).
  • Parameter influence often follows a negative power law ψ(i)=c·i⁻ᵏ.
  • Riemann zeta function helps describe performance bounds for complex systems.
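
One way to see the zeta-function connection: if the i-th most influential parameter contributes at most ψ(i) = c·i⁻ᵏ, then for k > 1 the total influence is bounded by c·ζ(k). A numeric sketch under that assumed form (constants are illustrative, not the author's fitted values):

```python
def psi(i, c=1.0, k=1.3):
    """Influence of the i-th most important design parameter."""
    return c * i ** (-k)

def cumulative_influence(n, c=1.0, k=1.3):
    """Total influence of the n most influential parameters."""
    return sum(psi(i, c, k) for i in range(1, n + 1))

# For k > 1 the series converges to c * zeta(k), so total performance
# stays bounded no matter how many parameters are optimized, and the
# first few parameters account for most of the achievable gain.
assert cumulative_influence(10) < cumulative_influence(1000) < 4.0
```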

7 of 20

Example: Processor Performance

  • Empirical example: Intel Pentium 4 and Celeron processors (2004).
  • Performance ∝ log(information) — higher cost → higher performance.
  • Demonstrates that cost scales with information content, while performance grows only logarithmically.

8 of 20

Example: Beam Optimization

  • Beam divided into n segments; performance is the maximum load before the stress limit is reached.
  • Efficiency increases with more information and parameters up to an optimum.
  • Few bits per parameter are sufficient for optimal refinement.
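
The "few bits per parameter" point can be illustrated with a simple quantizer; this is a generic sketch of parameter resolution, not the source's beam model:

```python
def quantize(x, lo, hi, bits):
    """Round x to one of 2**bits levels on the range [lo, hi]."""
    levels = 2 ** bits
    step = (hi - lo) / levels
    idx = min(int((x - lo) / step), levels - 1)
    return lo + (idx + 0.5) * step

# With 3 bits per segment parameter, the worst-case rounding error is
# half a step, (hi - lo) / 2**(bits + 1), often small enough that
# further refinement yields negligible performance gain.
print(quantize(0.52, 0.0, 1.0, 3))  # 0.5625
```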

9 of 20

Aircraft Design Parameter Influence

  • Aircraft design sensitivity shows diminishing influence of additional parameters.
  • Influence follows ψ(i) = c·i^(−1.3), a negative power law with exponent 1.3 (bounded total performance).
  • Supports idea that only most influential parameters need optimization.

10 of 20

Discussion

  • Design alternates between concept expansion and refinement (C–K theory).
  • New knowledge can be disruptive — shifting the optimal design entirely.
  • Balance between number of design parameters and refinement is essential.

11 of 20

Conclusions

  • Design is a learning process viewed through information theory.
  • Information links optimization, refinement, performance, and cost.
  • Optimal refinement level exists for given complexity — beyond that, expand design space.

12 of 20

Information Theory as a Framework for Design

  • Design can be viewed as an information transformation process — converting requirements into specifications.
  • Shannon’s communication model maps onto design: Designer = Transmitter, Design Process = Channel, System = Receiver.
  • Noise represents modeling uncertainty, incomplete data, and unpredictable external factors.
  • Information theory allows design progress to be quantified as reduction in uncertainty (entropy).

13 of 20

Key Information-Theoretic Concepts in Design

  • Entropy (H): Measures uncertainty or missing knowledge about design parameters.
  • Information (I = H₀ - H₁): Represents knowledge gained through iterations and testing.
  • Channel Capacity: Represents the maximum improvement rate achievable given methods and data quality.
  • Redundancy: Added intentionally for reliability and fault-tolerance, similar to error-correcting codes.
  • Algorithmic Complexity: Shortest description of design structure — a proxy for simplicity and elegance.
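
The I = H₀ − H₁ definition can be sketched for the simplest case, a uniformly distributed design parameter whose feasible range is narrowed by a test (illustrative helpers, not from the source):

```python
import math

def uniform_entropy_bits(width):
    """Differential entropy (bits) of a uniform distribution of given width."""
    return math.log2(width)

def information_gain(width_before, width_after):
    """I = H0 - H1: bits gained when a test narrows the feasible range."""
    return uniform_entropy_bits(width_before) - uniform_entropy_bits(width_after)

# Narrowing a parameter's feasible interval from 8 mm to 1 mm yields
# log2(8/1) = 3 bits of design information.
print(information_gain(8.0, 1.0))  # 3.0
```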

14 of 20

Design as an Information Transformation Process

  • Every iteration transforms ambiguous problem statements into structured solutions.
  • Design process can be modeled as I_out = f(I_in, n, η), where n = design variables and η = process efficiency.
  • Entropy decreases (uncertainty reduced) while design information increases (precision and order).
  • Refinement corresponds to compression — retaining essential design knowledge efficiently.

15 of 20

Design Optimization as Information Gain

  • Optimization = iterative discovery of information about optimal configurations.
  • Each evaluation adds bits of knowledge about the design — estimated at ≈0.15 bits per evaluation (Krus, 2004).
  • Information theory helps estimate convergence time and assess algorithmic efficiency.
  • A larger number of variables requires exponentially more information (evaluations) for convergence.
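
A crude convergence estimate follows from the rate above, assuming information accumulates linearly at the cited ≈0.15 bits per evaluation (a simplification; many search methods scale far worse):

```python
def evaluations_needed(n_params, bits_per_param, bits_per_eval=0.15):
    """Rough evaluation count to resolve n_params parameters to the
    requested resolution, at a fixed information-gain rate per evaluation."""
    total_bits = n_params * bits_per_param
    return total_bits / bits_per_eval

# 10 parameters at 8-bit resolution, gaining ~0.15 bits per evaluation,
# suggests on the order of 80 / 0.15 ≈ 533 evaluations.
print(round(evaluations_needed(10, 8)))  # 533
```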

16 of 20

Balancing Design Refinement and Expansion

  • Refinement: Increases accuracy by reducing uncertainty of known parameters (Δx).
  • Expansion: Adds new variables, increasing dimensionality of design space (concept exploration).
  • Optimal balance maximizes innovation potential while avoiding excessive complexity.
  • Information theory offers a way to formalize this trade-off quantitatively.

17 of 20

Performance, Cost, and Information Relations

  • Performance (p) is a function of design information (I) and cost (C ∝ I).
  • Performance initially grows rapidly with information, then saturates — diminishing returns.
  • Explains real-world phenomena: e.g., processor performance vs price shows information saturation.
  • Design cost and manufacturing precision both scale with information density in the system.

18 of 20

Creativity and Knowledge Expansion in Design

  • Creative design alternates between refining known solutions and expanding the design space.
  • When new parameters enter the knowledge domain, they can shift the optimum — disruptive innovation.
  • C–K Theory (Concept–Knowledge) aligns closely: knowledge expansion enables new concept generation.
  • Creativity = strategic addition of meaningful information that shifts solution boundaries.

19 of 20

Practical Design Implications

  • Avoid over-optimization: Beyond a certain information threshold, refinements yield negligible gains.
  • Adopt modular or hierarchical design to manage entropy across sub-systems.
  • Alternate exploration (concept expansion) with exploitation (refinement).
  • Quantify uncertainty as entropy; employ probabilistic modeling to minimize it.
  • Optimize information flow between requirement, modeling, and testing stages to reduce loss and noise.

20 of 20

Design as Entropy Management

  • Design can be seen as managing entropy — maximizing useful information while minimizing uncertainty.
  • Design information flow: I_environment → [Design Process + Noise] → I_specification.
  • The most successful designs act as efficient information systems — minimal loss, maximal clarity.
  • Entropy-based metrics could measure design efficiency, creativity, and robustness quantitatively.