1 of 5

Excellence Principle: {what mantra summarizes the prescription, goal and control plan?}

Theme:

Date:

Owner:

Team:

Measure/Descriptive Analysis

Current State Details:

{Insight into the nature of the above bounds, priorities, and any inter-relatedness/feedback among them. This should inform descriptive measures of candidate Utilities / Levers.}

Predictive Analysis

Root-Cause mapping: Relative-Impact (RI) Fishbones allow you to capture your early speculations on the potentially complex role of predictors in a visual form.

Relational Connections:

{Insight into the impact that candidate Utilities / Levers have on the Priority 1 (focal) Objective; i.e., Connections.

This should include descriptions of any non-linearities, uncertainty, effect lags and possible O-I feedback.

A useful approach is use of a Relative-Impact Fishbone diagram. You might include views that suggest correlations (e.g., scatter plots) as well.}

Define

{Insight into how the problem is perceived, including what performance improvements are desired; providing input into candidate Objectives for further consideration. Such Objectives are distinct from, but a step towards, prescriptive “goals”. Here it is critical to outline BOTH Managerial Objectives (e.g., ROI to be maximized, turnover to be minimized, recruitment increase, loss reduction, market capture growth, etc.), as well as Analytical Objectives, if distinct (e.g., R² increase, accuracy maximization, Type-II error reduction, maximizing the ratio of between- to within-group variance, etc.). The table below can be used to track these ideas (you may have more of one than another). NOTE: IF you are doing a substantial amount of Predictive Analytical work, or exploratory Descriptive work, regardless of whether you are trying to model a continuous variable or a classification, you must outline some metrics useful in gauging the performance of your analysis, both in the managerial and the analytical sense; we have no hope of improving things if we can’t measure how good they are and how good they can get.}

Background and Problem:

{Insight into how the current (i.e., pre-analysis) context developed, and into existing trajectories. Clear and tangible depictions of critical processes and outcomes, leveraging effective visuals, are particularly valuable here.}

Improve

Goal and Future State:

{A reiteration of the Priority 1 Analytical Objective, and implications for where the Priority 1 Managerial Objective can rationally be taken. Description of any potential issues in timing, and short- vs. long-term expectations on the approach of that Goal in terms of changing operational conditions and possible non-linear returns to effort. Expectations regarding the post-implementation state should go beyond just the approach towards the Priority 1 Goal. Tradeoffs expected with regard to other candidate Objectives should be outlined, as well as shifts in process bottlenecks and possible unintended consequences suggested by inherent risk in the proposed changes. Any anticipated next-stage target Objectives, possibly part of the lower-Priority set outlined here, can also be detailed; this is a key opportunity to Scrutinize the prescription posed, and can point to a return to some earlier assumptions.}

Integrated Analysis (Prescriptive)

Manifested System Dynamics:

{Manifest a depiction of any nuanced interdependencies not yet captured under the prior point, with emphasis on any critical levels of risk / uncertainty of impact, constraints, lags and feedback. This is your opportunity to outline technical aspects of how you actually go about the modeling task, including any limits imposed on the estimation (e.g., number or size of groups in classification, number of levels in a neural network, etc.). If applicable, combined-issue constraints (e.g., fixed pie, critical ratios, etc.) often point to I-I and feedback mechanisms. Causal Loop Diagrams should be presented IF I-I or O-I feedback is relevant.}

Explicated Decisions / Actions / Countermeasures:

{What analytical results did your modelling efforts yield? How did your models perform, in terms of the Analytical Objectives chosen? Where possible, validate analytically (Explicate) optimal levels of the key and leverageable Utility factors outlined earlier, or specific countermeasure plans deemed relevant to shifting the system towards an improved state. Include the projected level of the Priority 1 Objective upon ideal implementation of these.}

Control

Sustain / Requirements for Control:

{Once implemented, how will improvements be measured, and how will checks to the maintenance of these acquired gains be put in place? What safeguards supporting further continuous improvement will be implemented?}


{Since you don’t yet know which factors will be most important in analysis towards your above objectives, it is important to outline some candidates in a table like that below. If doing Predictive analysis, whether predicting a measured continuous or interval variable or training a classification engine on existing nominal groups, Utilities are going to include the predictors used in modelling (specifically, they imply yet-to-be-determined weights on those predictors, which will come out of analysis). If you are doing Prescriptive analysis (e.g., true Optimization with managerial outcomes), Utilities are the things you hope to change towards a managerial objective. If you are combining both methods, your listed Utilities may serve both purposes.}

A lot of room is afforded to this section, since there is typically a great variety of content that can be shared from the analysis done. We keep this section open to all forms of Predictive and Prescriptive approaches and results, in both graphical and tabular presentation.

CRITICALLY!! As you build out each section in this A3, you must remove the guiding text in RED, and any pre-existing figures/tables (which are simply placeholders).

| Managerial Objectives (Fnd / Mns) | Trns | Plst | Fit | Current State (w/ units) | Bounds |
|---|---|---|---|---|---|
|  |  |  |  |  |  |
|  |  |  |  |  |  |
|  |  |  |  |  |  |

| Analytical Objectives (Fnd / Mns) | Trns | Plst | Fit | Current State (w/ units) | Bounds |
|---|---|---|---|---|---|
|  |  |  |  |  |  |
|  |  |  |  |  |  |
|  |  |  |  |  |  |

| Utilities | T/P/F | State (& Bounds) |
|---|---|---|
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |
|  | / / |  |

| Connections (and Utilities / Objectives involved) | Current State (& Slack) |
|---|---|
|  |  |
|  |  |
|  |  |

{You may also find it useful to include a table, such as that presented here, to take note of complicated relationships between Utilities, and how much they might be modified in the future (current state slack).}

* Note: Prioritization of Utilities as elements of analysis is assessed in terms of Transparency (T) {can we see the data?}, Plasticity (P) {do we have influence over how the data can be used?} and Fit (F) {does analysis of this factor, and influence around it, align with user or organizational goals?}. On a scale from 1 to 5, an assessment of “1” implies a strong reason for prioritization based on a given dimension (T, P or F); “5” implies a low prioritization based on that dimension. Because these are distinct dimensions, any given Utility might have mixed rationale for its consideration in analysis (e.g., T/P/F appears as 2/1/5).
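If it helps to sort candidate Utilities once T/P/F scores are assigned, one simple composite is the equal-weight average of the three scores, with lower meaning higher priority. Both the utilities named below and the equal-weight rule are assumptions of this sketch, not part of the scheme above:

```python
# Hypothetical T/P/F assessments (1 = strong reason to prioritize, 5 = weak)
utilities = {
    "training_spend": (2, 1, 1),
    "staff_assigned": (1, 2, 3),
    "year":           (1, 5, 4),   # visible in the data, but not something we can influence
}

# Assumed composite: equal-weight average; lower average = higher analysis priority
ranked = sorted(utilities, key=lambda u: sum(utilities[u]) / 3)
print(ranked)
```

Any weighting of the three dimensions could be substituted for the equal-weight average, depending on how strongly a team values visibility versus influence versus goal alignment.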

2 of 5


[Diagram: Objectives / Connections / Utilities; Manifest / Explicate / Scrutinize]

3 of 5


Define

| Candidate Managerial Objectives | Current State Level Measured | Fundamental Bounds (e.g. 0) | Priority |
|---|---|---|---|
|  |  |  |  |
|  |  |  |  |

| Candidate Analytical Objectives | Current State Level Measured | Fundamental Bounds | Priority |
|---|---|---|---|
|  |  |  |  |
|  |  |  |  |

Measure/Descriptive Analysis

| Candidate Utilities / Levers | Current State Levels | Fundamental Bounds | Relative Impact |
|---|---|---|---|
|  |  |  |  |
|  |  |  |  |

Predictive Analysis

| Candidate Connections | Definition / Utilities involved | Current State Slack |
|---|---|---|
|  |  |  |
|  |  |  |

Improve

Integrated Analysis (Prescriptive)

Control


4 of 5

Distinguishing Descriptive, Predictive and Prescriptive Analysis

A distinction between these three genres of analysis helps for one simple reason:

Specific approaches to analysis may answer only specific questions.

Unless we are doing truly pure exploration (which is unlikely, since individuals are typically motivated by at least an implicit question) we should ALWAYS start by outlining what that question is. START by describing an objective in managerial terms, and THEN outline how you might numerically measure successful analysis towards that managerial objective. That is, outline your analytical objective (between/within variance in estimated groups, fit measure of a predictive model, classification accuracy, closeness of a performance measure to a theoretical upper bound or benchmark, etc.). We can’t manage what we can’t measure, and we can’t demonstrate value in analysis if we don’t have a yardstick in mind for analytical performance.
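For concreteness, two of the analytical yardsticks named above (R² for a continuous outcome, accuracy for a classification) can be computed in a few lines. The values below are hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical observed and model-predicted values for a continuous outcome
y_true = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
y_pred = np.array([10.5, 11.5, 14.2, 15.8, 18.5])

# R-squared: share of variance in y_true accounted for by the predictions
ss_res = np.sum((y_true - y_pred) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# Classification accuracy: share of correctly predicted group labels
g_true = np.array(["a", "b", "a", "c", "b"])
g_pred = np.array(["a", "b", "b", "c", "b"])
accuracy = np.mean(g_true == g_pred)

print(round(r_squared, 3), accuracy)
```

The same handful of lines gives both a managerial talking point ("the model explains most of the variation") and an analytical target to improve against.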

Brainstorming around how to gauge the success of your analysis helps point to the kind of analysis you want to perform.

It can also be useful to think of Descriptive, Predictive and Prescriptive analysis as successive steps in a fully integrated analysis process, with the aim not only of doing well analytically, and so informing managerial decision processes, but also of specifically outlining how to change things for the better.

Let’s imagine that we have a data set that consists of five fields:

X1: A continuous measure that we believe someone has some control over (e.g., amount of money spent on a specific type of training)

X2: An interval measure that we believe someone might have some control over (e.g., number of employees assigned to a kind of task)

X3: Another interval measure that we don’t have control over (like “Year”)

G: A nominal designation of grouping (e.g., task number 1, 2, 3 or 4; employee group a, b or c)

Y: A continuous measure that characterizes managerial, or organizational, performance in this setting (e.g., Revenue)

Genres of Analysis as a Process

Typically, we want to begin by getting familiar with our data, so we start with Descriptive Analysis.

The QUESTIONS we are trying to answer are things like: What is the average value of this measure? (Mean?) How can I summarize how its values are distributed? (Fit to Normal or Exponential?) How many things are in each group? (Count, Histogram?) Is there another way to group things to capture differences in X1 and X2? (Is there an exploratory clustering that informs differently than the current G?) These questions are essentially asking “What is the current and historical state/nature of my data?”

In other words, these are all DESCRIPTIVE Questions, and can benefit from DESCRIPTIVE analytical approaches. Objective performance of analysis here can be soft, but can also be clearly measured in some cases (e.g., semi-exploratory clustering yields measures of between/within group variance based on decisions to include variables in the cluster (Utilities); distribution fits have fit measures based on the distributions selected for fitness comparison).
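As a minimal sketch of one such measurable descriptive objective, the between/within group variance just mentioned can be computed by hand. The groups and values here are hypothetical stand-ins for a grouping G of some measure:

```python
import numpy as np

# Hypothetical observations for three existing groups (G = a, b, c)
groups = {
    "a": np.array([4.0, 5.0, 6.0]),
    "b": np.array([9.0, 10.0, 11.0]),
    "c": np.array([14.0, 15.0, 16.0]),
}
allx = np.concatenate(list(groups.values()))
grand_mean = allx.mean()

# Between-group variation: spread of group means around the grand mean
between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups.values())
# Within-group variation: spread of observations around their own group mean
within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())

# A high between/within ratio suggests the grouping captures real differences
print(grand_mean, between, within, between / within)
```

Comparing this ratio across alternative groupings (the current G versus an exploratory clustering) is one concrete way to score a descriptive analysis.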

Predictive analysis attempts to take a further step, inquiring into relationships among your data fields.

The QUESTIONS we are trying to answer are things like: How much does a small change in X1 impact Y? (Beta coeff. in OLS?) Do X1 and X2 predict group placement G? Does the impact of X1 on Y depend on X2, or G? (Estimates of the size and significance of interactions as predictors?) How does time (X3) figure in? (ARIMA?) These questions are essentially asking “How can I anticipate future changes in Y as other things change?” These are all PREDICTIVE Questions, and can benefit from PREDICTIVE analytical approaches. All forms of predictive modelling have a variety of fit statistics. These are the analytical Objectives that you are keeping an eye on, in the hope that the weights (e.g., Betas) assigned to predictors work out. Such model parameters (e.g., Betas) are the levers (Utilities) available in pursuit of model-fit Objectives.
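A small illustration of the first two predictive questions: the Betas below are recovered by ordinary least squares on a design matrix that includes an X1×X2 interaction. The "true" coefficients used to simulate Y are assumptions of this example, not part of the template:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical predictors: training spend (X1) and staffing level (X2)
x1 = rng.uniform(0, 10, n)
x2 = rng.integers(1, 6, n).astype(float)

# Hypothetical "true" process: Y depends on X1, X2 and their interaction, plus noise
y = 2.0 + 1.5 * x1 + 3.0 * x2 + 0.5 * x1 * x2 + rng.normal(0, 0.1, n)

# Design matrix with an intercept and an interaction term
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
betas, *_ = np.linalg.lstsq(X, y, rcond=None)

# Each Beta estimates how a small change in its predictor moves Y
print(np.round(betas, 2))
```

The estimated Betas are the weights that analysis assigns to the predictors; the nonzero interaction weight is exactly the "does the impact of X1 depend on X2?" question made numeric.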

Prescriptive analysis attempts to go still further, suggesting a very specific plan of attack for changing things towards managerial objectives.

The QUESTIONS we are trying to answer are things like: How much should we change X1 and X2 to impact G and/or Y? (Improvement to Y by means of targeted changes to X1 and X2, subject to limits on each.) These questions are essentially asking “How can I make future changes in Y by changing other things?” These are all PRESCRIPTIVE Questions, and benefit from PRESCRIPTIVE approaches. Result: suggested changes to X1 and X2 (Utilities) towards real improvement in Y (Objective).
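One simple way to act on the prescriptive question is an exhaustive grid search over bounded levels of X1 and X2, using a fitted response surface. The surface and the bounds below are hypothetical, for illustration only; with many more levers, the landscape-search methods on the last page replace this brute-force scan:

```python
import numpy as np

# Hypothetical fitted response surface from a prior predictive model:
# Y = 2 + 1.5*X1 + 3*X2 + 0.5*X1*X2 - 0.2*X1^2  (diminishing returns on X1)
def y_hat(x1, x2):
    return 2 + 1.5 * x1 + 3 * x2 + 0.5 * x1 * x2 - 0.2 * x1 ** 2

# Managerial limits (bounds) on how far each lever can be pushed
x1_grid = np.linspace(0, 10, 101)   # e.g., training spend, in $000s
x2_grid = np.arange(1, 6)           # e.g., employees assigned (integer)

# Evaluate Y over every feasible (X1, X2) combination and keep the best
X1, X2 = np.meshgrid(x1_grid, x2_grid)
Y = y_hat(X1, X2)
best = np.unravel_index(np.argmax(Y), Y.shape)

print(X1[best], X2[best], Y[best])
```

The output is exactly the prescriptive deliverable described above: specific levels of the Utilities (X1, X2) and the projected level of the Objective (Y) at those settings.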

[Diagram: Describe (X and Y) → Predict (Y = f(β, X)) → Prescribe (ΔX to ΔY)]

5 of 5

Descriptive Analysis

- Sample tabular data has been visually examined / vetted? If No: Clean (ensure data is free of errors in record, ambiguity in non-numerical equivalency, unit inconsistency, etc.).
- Measurement error partly addressable by multiple groupings? If Yes: Delineate (t-Test, ANOVA; group observations by logical or objective criteria, e.g. cluster analysis).
- Measurement error partly addressable by multiple measures? If Yes: Aggregate (reduce piece-wise measure variation through factor estimation, e.g. PCA).
- Above actions account for all observed variation? If No: Transform data (Box-Cox, Yeo-Johnson, etc.) until remaining variation approximates the desired profile (e.g. Normal).
- Finally, Assess Distributions: stats by group, histograms, confidence ellipses, measured kurtosis, etc.; nominal spatial distinctions / composition (heat maps, etc.).

Predictive Analysis

- Outcome to be predicted is categorical? If Yes: Classification.
  - Low record to dimension ratio (<1000)? If Yes: Support Vector Machine (SVM).
  - Need clear boundaries? If Yes: Decision forest, Decision jungle; if No: Logistic modeling, Neural networks (Bayes point machine, if 2-class).
- Otherwise: Interval Regression.
  - Outcomes are rank-ordered values? If Yes: Ordinal Regression.
  - Outcomes are counts? If Yes: Poisson or Binomial Regression.
  - Outcomes are time interdependent (e.g. time series)? If Yes: Econometric approaches (e.g. ARIMA); system dynamics / feedback model estimation.
  - Otherwise, based on the distribution of Y, possible presence of non-linearity, moderation/mediation, and any need for residual examination: Ordinary least squares (OLS), GLM, HLM, Boosted decision trees.

Prescriptive Analysis

- Closed form solutions derivable (e.g. by calculus)? If Yes: Derive optimal solution formulaically.
- Uncertainty in objective, utility decision variables, and/or connections / constraints? If Yes: Simulation Optimization (SA, Genetic algorithm, etc.); if risk profiles for the optimal solution and likely alternates exist: Ad-hoc sensitivity / what-if analysis.
- All objective-utility connections are continuous, monotonic functions? If Yes: Linear or non-linear hill climbing; if No: Discontinuous / multimodal landscape search (GA, SA, etc.).
- Low possible solution count to search time ratio (<100K/sec)? If Yes: Comprehensive landscape examination.

Note: In all cases above, complex model estimation can require considerable training. Large dimensionality (parameters to estimate) will require larger quantities of data and time, to avoid misleading over-estimation.

Note: In all non-monotonic cases above, complex terrain search can require considerable investments in time. In some cases, the total number of possible solutions cannot be comprehensively examined. Benchmarking and the use of heuristics should accompany any such efforts.

Note: Fundamental to descriptive analysis is access to appropriate data, representative of the objectives to be pursued, and the levers (utility variables) available to exert change. You can’t manage what you don’t measure.
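The Box-Cox transform named in the descriptive flow can be sketched in a few lines. In practice the power parameter lam is usually chosen by maximum likelihood (e.g., via a library routine); here it is fixed by hand purely for illustration:

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox power transform for strictly positive data:
    (x**lam - 1) / lam, with the log transform as the lam = 0 limit."""
    x = np.asarray(x, dtype=float)
    if lam == 0:
        return np.log(x)
    return (x ** lam - 1) / lam

# Hypothetical right-skewed measure (e.g., revenue per record)
x = np.array([1.0, 2.0, 4.0, 8.0, 64.0])

# lam = 0 (the log transform) often tames multiplicative, right-skewed data
print(box_cox(x, 0.0))
```

After transforming, the "remaining variation approximates desired profile (e.g. Normal)?" check in the flow above can be applied to the transformed values rather than the raw ones.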

Methods associated with Descriptive, Predictive and Prescriptive Analysis

Excerpted from Figure 1 of Bendoly, E. 2019. A Framework for Analytical Approaches. International Institute for Analytics, Research Brief, Research & Advisor Network. September.