Digital engineering (DE) measurement table. Each numbered row lists: Information Category; Measurable Concept; Project Information Needs; Enterprise Information Needs; Prospective Measures; Notes.
2. Schedule and Progress / Architectural Completeness
  Project Information Needs:
  - How complete is the architecture model? Does the architecture account for all required functions?
  - Is the architecture sufficiently complete to proceed with design at acceptable risk?
  Enterprise Information Needs:
  - What is the amount of schedule and design risk for each project?
  - What is the architecture progress across projects?
  Prospective Measures: Architecture Completeness and Volatility *; Time to PDR, Time to CDR
  Notes:
  - The goal is to compress these times while reducing the number of defects.
  - Update to address Mission Engineering.
3. Schedule and Progress / Model Coverage
  Project Information Needs:
  - What is the extent of traceability across digital model elements? What traceability gaps exist?
  - What is our progress in completing the digital model?
  Enterprise Information Needs:
  - What is the extent of model traceability for a set of projects?
  - What is the modeling coverage and progress of the digital engineering capability across projects?
  - What is the current upper limit of the digital engineering capability?
  Prospective Measures: Model Traceability *; Model Coverage (e.g. modeled elements)
  Notes:
  - Measurement is against only the digital model elements.
  - Model elements are created to fulfill the functions and interfaces allocated during the architecture and design phases.
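The model traceability and coverage measures above can be sketched as a simple coverage ratio plus a gap list. This is a minimal illustration assuming a hypothetical in-memory map of model elements to their trace links; a real DE environment would export this data from the modeling tool.

```python
# Sketch: model traceability coverage and gap detection.
# The element ids and trace links below are illustrative, not from the source.

def traceability_coverage(elements: dict[str, list[str]]) -> tuple[float, list[str]]:
    """Return (fraction of elements with at least one trace link, untraced element ids)."""
    gaps = [eid for eid, links in elements.items() if not links]
    covered = len(elements) - len(gaps)
    return covered / len(elements), gaps

model = {
    "REQ-001": ["SYS-ARCH-01"],   # requirement traced to an architecture element
    "REQ-002": ["SYS-ARCH-02"],
    "REQ-003": [],                # traceability gap
    "SYS-ARCH-01": ["DES-010"],
}
coverage, gaps = traceability_coverage(model)
print(f"coverage={coverage:.0%}, gaps={gaps}")  # coverage=75%, gaps=['REQ-003']
```

Tracked per project and rolled up, the same ratio answers both the project-level gap question and the cross-project coverage question.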
4. Size and Stability / Functional Size and Stability
  Project Information Needs:
  - What is the size and scope for the DE project or product? How much work must be done?
  - How many functions and interfaces have been identified in the system architecture or design? How much is that changing?
  - How does DE product size relate to estimates and measures of cost, schedule, productivity, or performance?
  Enterprise Information Needs:
  - Is the current project similar in size and scope to historical projects?
  - Is the work scope changing? Are the schedule and effort sufficient to address changes?
  - How does DE product size relate to estimates and measures of cost, schedule, productivity, or performance?
  Prospective Measures: Product Size * (e.g. Model Elements); Architecture Completeness and Volatility *; Functions Identified; Functional Change Requests
  Notes:
  - In development, product size can be determined by a count of model elements.
  - Function volatility includes both continuing to identify new functions and continuing changes to the functional allocation.
  - In maintenance, change requests are often used as a measure of work scope.
  - Software size needs to be measured prior to, during, and post transformation.
  - The overall DE benefit may need to be measured in smaller components/groups if the entire enterprise/company is unable to aggregate to the whole.
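One common way to quantify the functional volatility described above is a churn ratio over the baselined functional size. The formula and numbers here are an illustrative assumption, not taken from the source table.

```python
# Sketch: functional volatility as (added + changed + deleted) / baseline size.

def functional_volatility(added: int, changed: int, deleted: int, baseline: int) -> float:
    """Churned functions in a period as a fraction of the baselined functional size."""
    if baseline <= 0:
        raise ValueError("baseline size must be positive")
    return (added + changed + deleted) / baseline

# e.g. 12 new, 30 changed, 8 deleted functions against a 400-function baseline
print(f"volatility={functional_volatility(12, 30, 8, 400):.1%}")  # volatility=12.5%
```

A rising ratio release over release signals that work scope is still changing and that schedule and effort estimates may need revisiting.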
5. Product Quality / Functional Correctness
  Project Information Needs:
  - Are we finding and removing anomalies early in the life cycle using models and shared information?
  - Is the quality of the product in question adequate for the product to be used in subsequent phases or activities?
  - Are requirements being fulfilled sooner and ahead of schedule more often?
  Enterprise Information Needs:
  - How many anomalies were released (escaped) to operations?
  - Is the use of DE leading to the detection of anomalies earlier in the life cycle compared to traditional methods or projects? Has the detection curve shifted to the left?
  Prospective Measures: DE Anomalies *
  Notes: For digital engineering, focus on the defects for modeling and simulation (including drawings).
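The "detection curve" and escape questions above can be made concrete by tallying anomalies by the phase in which they were detected. This is a sketch assuming each anomaly record carries a detection-phase label; the phase names are illustrative.

```python
# Sketch: anomaly detection profile and escapes to operations.
from collections import Counter

def detection_profile(anomaly_phases: list[str]) -> tuple[Counter, int]:
    """Return (anomaly count per detection phase, number escaped to operations)."""
    profile = Counter(anomaly_phases)
    return profile, profile["operations"]

profile, escaped = detection_profile(
    ["modeling", "modeling", "design", "test", "operations"]
)
print(f"escaped={escaped}, profile={dict(profile)}")
```

Comparing profiles between DE and traditional projects shows whether the mass of the distribution has shifted left toward earlier phases.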
6. Product Quality / Functional Correctness
  Project Information Needs:
  - How much rework effort is spent on planned or unplanned changes to DE work products across the life cycle?
  - Does an improved architecture reduce sustainment and maintenance effort?
  Enterprise Information Needs:
  - How much is rework reduced through use of DE?
  - Can changes to work products be implemented more efficiently and with less effort in a DE environment relative to traditional methods?
  Prospective Measures: Adaptability and Rework *; Acceptance of Completed Work Products (e.g. Model Elements, Artifacts); Rework or Rework Defects
  Notes: Completion of work products requires defined acceptance criteria. Rework is required when the acceptance criteria are not met.
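Given the acceptance-criteria note above, a basic rework measure is the fraction of work products that failed first-pass acceptance. The record shape below is an assumption for illustration.

```python
# Sketch: rework ratio over submitted work products, assuming each product
# records whether it met its acceptance criteria on first submission.

def rework_ratio(products: list[dict]) -> float:
    """Fraction of work products that failed acceptance and required rework."""
    reworked = sum(1 for p in products if not p["accepted_first_pass"])
    return reworked / len(products)

submissions = [
    {"id": "MODEL-01",    "accepted_first_pass": True},
    {"id": "MODEL-02",    "accepted_first_pass": False},  # rework required
    {"id": "ARTIFACT-03", "accepted_first_pass": True},
    {"id": "MODEL-04",    "accepted_first_pass": False},  # rework required
]
print(f"rework ratio={rework_ratio(submissions):.0%}")  # rework ratio=50%
```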
7. Product Quality / Functional Correctness
  Project Information Needs:
  - What traceability gaps or defects exist in the digital model?
  - Does model traceability support change impact assessments (requirements, design, compliance)?
  Enterprise Information Needs:
  - Is architectural traceability improved using digital engineering methods relative to traditional approaches?
  Prospective Measures: Model Traceability *; Traceability Anomalies
8. Process Performance / Process Effectiveness
  Project Information Needs:
  - How many released, validated system definitions/analyzed elements were functionally correct but returned for rework?
  Enterprise Information Needs:
  - Is the organization learning how to reduce the number of defects released to operations?
  Prospective Measures: Model Element; DE Anomalies
  Notes: The model is to be organized and documented in a universal method that is understood by the team and other systems. [Open Architecture]
9. Process Performance / Process Effectiveness
  Project Information Needs:
  - Are we containing defects in early phases using models and shared information?
  Enterprise Information Needs:
  - Are we finding and removing defects earlier using digital engineering methods relative to traditional methods?
  Prospective Measures: DE Anomalies *; Rework Effort; Reworked Model Elements
  Notes:
  - For digital engineering, focus on the defects for modeling and simulation (including drawings).
  - The focus is whether the process is improved using digital engineering, versus the raw numbers.
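The defect-containment question above is often answered with a phase containment ratio: the fraction of defects detected in the same phase where they were injected. This sketch assumes each defect record carries both phases; the phase ordering and data are illustrative.

```python
# Sketch: phase containment effectiveness for defects.
PHASES = {"modeling": 0, "design": 1, "integration": 2, "test": 3, "operations": 4}

def containment_effectiveness(defects: list[tuple[str, str]]) -> float:
    """Fraction of (injected_phase, detected_phase) pairs caught in the injection phase."""
    contained = sum(1 for injected, detected in defects
                    if PHASES[detected] == PHASES[injected])
    return contained / len(defects)

defects = [("modeling", "modeling"), ("modeling", "design"),
           ("design", "design"), ("design", "test")]
print(f"containment={containment_effectiveness(defects):.0%}")  # containment=50%
```

As the note says, the useful comparison is the trend under DE versus traditional methods, not the raw numbers.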
10. Process Performance / Process Efficiency - Automation
  Project Information Needs:
  - What percentage of artifacts are automatically generated from digital models?
  - To what extent are artifacts facilitating program reviews?
  Enterprise Information Needs:
  - What is the extent of automation across projects?
  - How much is automation contributing to meeting performance and quality objectives?
  - What is the return on investment for DE?
  - How much can cycle time be reduced through automation of DE?
  Prospective Measures: Product Automation *; Cycle Time
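The product automation measure above reduces to the share of delivered artifacts generated automatically from the digital model. The tagging scheme and artifact names below are assumptions for illustration.

```python
# Sketch: percentage of artifacts auto-generated from digital models,
# assuming each artifact is tagged with its production method.

def automation_percentage(artifacts: list[dict]) -> float:
    """Fraction of artifacts whose source is automatic generation from the model."""
    generated = sum(1 for a in artifacts if a["source"] == "auto-generated")
    return generated / len(artifacts)

artifacts = [
    {"name": "ICD",           "source": "auto-generated"},
    {"name": "SDD",           "source": "auto-generated"},
    {"name": "test report",   "source": "manual"},
    {"name": "review slides", "source": "auto-generated"},
]
print(f"automation={automation_percentage(artifacts):.0%}")  # automation=75%
```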
11. Process Performance / Process Efficiency - Speed
  Project Information Needs:
  - How long does it take to deploy an identified feature/capability?
  - How long does it take to deploy a viable product for operational use after a request is received?
  - Where is the deployment bottleneck: in planning/backlog, implementation, or deployment of the implemented capability?
  Enterprise Information Needs:
  - How long does it take to develop a DE model or product? Does the DE process performance meet business objectives?
  Prospective Measures: Deployment Lead Time *; Cycle Time
  Notes: Proper analysis also requires an enterprise approach for quantifying size or complexity of work products.
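Deployment lead time and the bottleneck question above can be computed from per-capability timestamps. The field names and dates here are assumptions; a real pipeline would pull them from the backlog and deployment tooling.

```python
# Sketch: deployment lead time and a coarse stage breakdown from timestamps.
from datetime import date

def lead_time_days(requested: date, deployed: date) -> int:
    """Calendar days from request to operational deployment."""
    return (deployed - requested).days

def stage_durations(requested: date, started: date, deployed: date) -> dict[str, int]:
    """Days spent in planning/backlog vs. implementation through deployment."""
    return {
        "backlog": (started - requested).days,
        "implementation_to_deploy": (deployed - started).days,
    }

req, start, dep = date(2024, 1, 10), date(2024, 2, 1), date(2024, 3, 15)
print(lead_time_days(req, dep))          # 65
print(stage_durations(req, start, dep))  # {'backlog': 22, 'implementation_to_deploy': 43}
```

The stage with the largest share of the lead time is the candidate bottleneck.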
12. Process Performance / Process Efficiency
  Project Information Needs:
  - Is productivity improving over time (normalized model elements/artifacts delivered by effort)?
  - How many model elements/artifacts are being produced per release?
  - How many can be expected to be produced for the next release?
  Enterprise Information Needs:
  - Is productivity improving over time (normalized model elements/artifacts delivered by effort)?
  - Is our productivity sufficient to meet our customer's needs?
  - How much is productivity increased through the use of digital engineering?
  Prospective Measures: Productivity; Model Elements/Release; Artifacts/Release
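The effort-normalized productivity measure above can be sketched as delivered model elements per effort hour per release. The release names, counts, and hours are illustrative, not historical data.

```python
# Sketch: effort-normalized productivity trend across releases.

def productivity(model_elements: int, effort_hours: float) -> float:
    """Model elements delivered per effort hour."""
    return model_elements / effort_hours

# (release, model elements delivered, effort hours) -- illustrative values
releases = [("R1", 840, 1200.0), ("R2", 910, 1150.0), ("R3", 1010, 1100.0)]
for name, elements, hours in releases:
    print(f"{name}: {productivity(elements, hours):.2f} elements/hour")
```

A monotonically rising ratio across releases is the "productivity improving over time" signal; the same normalization is what makes DE and traditional projects comparable.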
13. Technology Effectiveness / Technology Performance
  Project Information Needs:
  - What is the runtime performance of the capability or system?
  - What is the likelihood that runtime performance will meet operational requirements (for each alternative solution)?
  - Where are the runtime performance bottlenecks, and how can operational performance be optimized?
  Enterprise Information Needs:
  - How much does runtime affect interoperability of the system?
  - Where is redesign needed to solve compatibility issues?
  Prospective Measures: Runtime Performance *; Elapsed Time