
Don’t Bite the Whole Apple!

Frank Doherty
PMW-189
Deputy Program Manager

dohertyf@spawar.navy.mil

Corinne C. Segura
Integrated Computer Engineering, Inc.
A Wholly Owned Subsidiary of ASC
cory.segura@iceincusa.com
Purpose

To provide guidelines for the selection of Best Practices and Lessons Learned to improve the chance of software project success.

-2-
OUTLINE

• Historical Perspective
• Identification of Main Problems
• Development of Solution Set
• Solution Selection
• Implementation
• End Results
• Lessons Learned

-3-
History Shows…

• Only 16% Of Software Projects Are Expected To Finish On Time And On Budget(1)
• Projects Completed By The Largest US Organizations Have Only 42% Of Originally Proposed Functions(1)
• An Estimated 53% Of Projects Will Cost Nearly 190% Of Their Original Estimates(1)
• In Large Companies, Only 9% Of Projects Will Be Completed On Time & On Budget(1)
• Canceled Projects - $81 Billion Loss To US In 1995(1)
• Average MIS Project - 1 Year Late, 100% Over Budget(2)

(1) Standish Group International report, “Chaos,” as reported in Open Computing. Copyright SPC.
(2) Capers Jones, Applied Software Measurement, McGraw-Hill.
-4-
The Defense Science Board
Said (over 10 years ago)

“Today’s major problems with military software development are not technical problems, but management problems.”
[Defense Science Board Task Force on Military Software, executive summary]

And after ten years it still holds true!

-5-
How Do We…

• Define the main issues
• Prioritize the areas needing focus
• Establish criteria for selecting a solution set
• Measure the effectiveness of implemented changes

-6-
PMW-189 Experience

-7-
Our Mission

• PMW 189 is responsible for development, production, and deployment of fleet cryptologic/signals intelligence/direction finding systems (5 ACAT III/IV programs, $1.2 billion budget in FYDP)
• Software is the backbone of the capability provided by PMW 189
  – Complex, operationally demanding applications
  – Historically, software has been our “Achilles Heel”

-8-
Prioritized Main Issues

• Our programs have a high rate of rework and scrap
• Our products have numerous defects after delivery to the fleet user
• We seem to have difficulty finding the right developer who can deliver a quality product
• We are reactive rather than proactive

-9-
Cause and Effect: High Rework and Scrap

[Fishbone diagram. Causes: Staff Experience, Reqs Churn, Undefined Reqs, Poorly Defined Development Processes, Poor Foundation for Continued Development, Poor V&V. Effect: High Rework and Scrap]

- 10 -
Phase One
Focus on improving the
development and requirements
processes

- 11 -
Phase One
Implementation Strategy

• Baseline current performance
• Focus on top risk items
• Select a maximum of four changes for simultaneous implementation within the program
• Motivate the entire team for implementation of the selected best practices

- 12 -
Baseline Current Product

• Utilizing the Software Program Managers Network (SPMN) 16 Point Plan to:
  – Assess program health and status
  – Realize true state of software processes
  – Identify top risks
  – Validate current schedules and estimates

- 13 -
Process Improvement Focus Was
Implementation Of SPMN 16 Point Plan ™

The 16 Point Plan comprises 16 proven practices from industry and the SPMN Airlie Council:

Project Integrity
• Adopt Continuous Risk Management
• Estimate Cost and Schedule Empirically
• Use Metrics to Manage
• Track Earned Value
• Track Defects Against Quality Targets
• Treat People as the Most Important Resource

Construction Integrity
• Adopt Life Cycle Configuration Management
• Manage and Trace Requirements
• Use System-Based Software Design
• Ensure Data and Database Interoperability
• Define and Control Interfaces
• Design Twice, Code Once
• Assess Reuse Risks and Costs

Product Stability & Integrity
• Inspect Requirements and Design
• Manage Testing as a Continuous Process
• Compile and Smoke Test Frequently

The plan incorporates proven commercial best practices, focuses on high-leverage, bottom-line results, brings big savings, uses flexible templates, and is readily implemented. It is not proprietary, and its practices are specific, measurable, attainable, and realistic.

- 14 -
Baseline Process

• First assessment was held at the Program Office and was supported by the developer
• Utilized the ICE/SPMN tool and questionnaire
• Ratings based on the following scale (rolled up in the sketch below):
  1 = Component not evident
  2 = Respondent aware of requirement; not yet implemented
  3 = Implementation under way; not yet in place
  4 = Implementation evident, but not effective
  5 = Effective implementation
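
Purely as an illustration (not part of the briefing), a minimal sketch of how ratings on this 1-5 scale might be rolled up into per-category scores to spotlight the weakest areas; the practice names, groupings, and ratings below are placeholders:

```python
# Hypothetical roll-up of 16 Point Plan ratings (1 = not evident ... 5 = effective).
# Practice names, category groupings, and ratings are illustrative placeholders.
from statistics import mean

ratings = {
    "Adopt Continuous Risk Management": 2,
    "Track Earned Value": 1,
    "Manage and Trace Requirements": 3,
    "Compile and Smoke Test Frequently": 2,
}
categories = {
    "Project Integrity": ["Adopt Continuous Risk Management", "Track Earned Value"],
    "Construction Integrity": ["Manage and Trace Requirements"],
    "Product Stability & Integrity": ["Compile and Smoke Test Frequently"],
}

for category, practices in categories.items():
    print(f"{category}: {mean(ratings[p] for p in practices):.1f}")

# Anything rated below 3 is not yet implemented -- a candidate top-risk focus area.
print("Focus areas:", [p for p, r in ratings.items() if r < 3])
```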

- 15 -
First Assessment Results

[Bar chart: “16-Point Plan Assessment” showing first-assessment ratings (0.0-5.0 scale) across all 16 practice areas, from Formal Risk Management through Frequent Compile and Smoke Testing]

- 16 -
Observations

• Unwarranted optimism plagues the program
• Overall processes and requirements not defined
• Schedules and estimates did not have a firm basis
• Difficulties in understanding the impacts of process shortfalls
• Product functionality missing

- 17 -
Subsequent Assessments

• After the baseline assessment, the contractor implemented changes in 12 of the 16 areas
• Teams were formed to work on assigned focus areas
• Risk Management was embedded into daily business practices
• Two follow-on assessments were conducted at 4-month intervals to measure progress (compared in the sketch below)
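
A minimal sketch, with invented scores, of how successive assessment snapshots might be compared to confirm progress and flag any practice that slipped between assessments:

```python
# Compare successive 16 Point Plan assessment scores (0-5 scale); values invented.
assessments = {
    "Formal Risk Management":        [1.0, 2.5, 4.0],
    "Earned Value Tracking":         [1.5, 3.0, 3.5],
    "Managing Testing as a Process": [2.0, 2.0, 1.5],  # an example slip
}

for practice, (first, second, third) in assessments.items():
    trend = "improved" if third > first else "regressed"
    flag = "  <-- review" if third < second else ""
    print(f"{practice}: {first} -> {second} -> {third} ({trend}){flag}")
```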
- 18 -
Project Integrity Summary

[Bar chart: first, second, and third assessment scores (0-5) for the Project Integrity practices: Formal Risk Management, Empirical Cost and Schedule Estimation, Metrics-Based Project Management, Earned Value Tracking, Defect Tracking Against Quality Targets, People-Aware Program Management]

- 19 -
Construction Integrity

[Bar chart: first, second, and third assessment scores (0.0-5.0) for the Construction Integrity practices, including Configuration Management, End-to-End Requirements Management, System Architecture-Based Design, Data and Database Interoperability, Formal Definition & Control of Interfaces, and Cost- and Quality-Justified Design]

- 20 -
Product Quality

[Bar chart: first, second, and third assessment scores (0.0-5.0) for the Product Quality practices: Formal Inspections, Managing Testing as a Continuous Process, Frequent Compile and Smoke Testing]

- 21 -
Did It Work?

• Significant improvements in
– Software durability
– Software reliability
– Training
– Documentation
• Cultural resistance diminishing

- 22 -
Phase One Lessons Learned
• Keys to successful implementation: top management leadership, training, and joint PMW, SSC, and industry teaming
• Real pay-offs occur when risk mitigation is achieved through prevention-based technical strategies
• Risk can be a guide to needed process improvement and application of Best Practices
• Only 1-2 Best Practices can be implemented at a time
• There is benefit in fostering an environment where everything is challenged
• Using outside experts to assess risks is a good idea
• We want to repeat our successes, not our failures
- 23 -
Priority Two Issue

Our products have numerous defects after delivery to the fleet user

- 24 -
Cause and Effect: High Number of Defects

[Fishbone diagram. Causes: Lack of Inspections, Lack of Regression Testing, Inadequate Test Schedule, Inexperienced Testers, Inadequate Test Practices, Poor V&V. Effect: High Number of Defects]

- 25 -
Phase Two
Focus on product functionality, reliability, and test strategies

- 26 -
Phase Two Prioritization

• Near Term:
  – Reliability Growth
  – Good Testing Practices
• Mid-Term:
  – Unwarranted Optimism
  – Accountability of Practices

- 27 -
Established a Working Group

• Team members were experienced in software development practices and included an independent member
• Informally assessed test practices
• Identified issues and obstacles
• Defined issues
• Prioritized recommendations for near- and mid-term accomplishment
• Identified required resources for implementation

- 28 -
Reliability Growth Plan

• Near Term Recommendations:
  – Conduct an independent Pre-OPEVAL assessment
  – Examine fault profiles
  – Establish a reliability improvement scale
  – Establish quality gates for each DT evolution (see the sketch below)
  – Conduct a team meeting with OPTEVFOR to review OT rules and how they will be applied to the program
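
As a hedged illustration only (the criteria and thresholds are invented, not from the program), a minimal sketch of what a quality gate check for a DT evolution might look like:

```python
# Hypothetical quality gate for a developmental test (DT) evolution.
# Exit criteria and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class BuildStatus:
    open_priority1_defects: int
    test_pass_rate: float        # fraction of planned tests passing
    regression_suite_green: bool

def passes_gate(status: BuildStatus) -> bool:
    """A build enters the next DT evolution only if all criteria hold."""
    return (
        status.open_priority1_defects == 0
        and status.test_pass_rate >= 0.95
        and status.regression_suite_green
    )

build = BuildStatus(open_priority1_defects=2, test_pass_rate=0.97,
                    regression_suite_green=True)
print("Gate passed" if passes_gate(build) else "Gate failed: hold the build")
```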

- 29 -
Reliability Growth Plan
(cont’d)

– Establish a cut-off period for any future developments or ECPs and focus the remaining time on building reliability
– Construct a “backwards” schedule to determine the latest dates for corrective action
– Review the defect tracking system and required metrics to determine discovery and correction rates (see the sketch below)
– Validate the test environment
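
A minimal sketch, assuming a simple log of discovery and correction dates (all data invented), of how discovery and correction rates might be derived to see whether closures keep pace with discoveries:

```python
# Weekly defect discovery vs. correction rates from a defect log (data invented).
from collections import Counter
from datetime import date

# (discovered, corrected) pairs as they might come from a PCR database; None = open
defects = [
    (date(2002, 3, 4), date(2002, 3, 18)),
    (date(2002, 3, 5), None),
    (date(2002, 3, 11), date(2002, 3, 25)),
    (date(2002, 3, 12), date(2002, 3, 19)),
]

discovered = Counter(d.isocalendar()[1] for d, _ in defects)   # ISO week found
corrected = Counter(c.isocalendar()[1] for _, c in defects if c)

for week in sorted(set(discovered) | set(corrected)):
    print(f"week {week}: found {discovered[week]}, fixed {corrected[week]}")

print("still open:", sum(1 for _, c in defects if c is None))
```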

- 30 -
Reliability Growth Plan
(cont’d)

• Mid-term Actions:
  – Conduct end-to-end path testing
  – Expand white box testing
  – Conduct end-to-end code walkthroughs
  – Determine software reliability allocations for all subsystems
  – Conduct regression testing on each build

Words of Wisdom (a worst practice to avoid): “Being naïve and expecting this to be done by statistically unsophisticated people who can’t or won’t do the math in order to discover which caveats do and don’t apply in your case.”

Testing: Best and Worst Practices – A Baker’s Dozen by Boris Beizer

- 31 -
Test Practice
Recommendations
• Change the test structure: add enhanced negative testing and system-level white box testing
• Enhance the current regression test process
• Conduct path analysis and schedule testing to focus on the top 20% of paths utilized (see the sketch below)
• Complete a Pedigree Analysis
• Conduct Fagan Inspections on key threads/critical paths

Words of Wisdom: “Good testing practices can’t be defined by simple rules of thumb or fixed formulas. What is best in one circumstance can be worst in another.”

Testing: Best and Worst Practices – A Baker’s Dozen by Boris Beizer
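
A minimal sketch of the Pareto cut implied by the path-analysis recommendation: rank observed execution paths by usage and keep the top 20% as the test-scheduling focus. Path names and hit counts are invented:

```python
# Rank execution paths by observed usage and select the top 20% for focused testing.
# Path names and hit counts are invented placeholders.
usage = {
    "login->search->display": 5400,
    "login->report->export": 2100,
    "login->admin->config": 300,
    "login->search->export": 180,
    "login->help": 20,
}

ranked = sorted(usage, key=usage.get, reverse=True)
top = ranked[: max(1, round(0.20 * len(ranked)))]

covered = sum(usage[p] for p in top) / sum(usage.values())
print("Focus testing on:", top)
print(f"These paths carry {covered:.0%} of observed usage")
```

A small share of paths typically carries most of the operational load, which is why concentrating test effort there pays off fastest.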


- 32 -
Interim Results

• Reliability Growth is the main focus
  – 8% improvement in software MTBOMF (computation sketched below)
• Defect Discovery Rate
  – More robust test environment
  – 48% increase in PCRs discovered within the first month of implementation
• Defect Removal Rate
  – Closure rate is steady
  – Fix stage scheduled between major test events
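
Assuming MTBOMF here means Mean Time Between Operational Mission Failures, a minimal sketch (hours and failure counts invented) of how the figure and its growth between test periods might be computed:

```python
# MTBOMF = operating hours / operational mission failures, per test period.
# Period names, hours, and failure counts are invented for illustration.
periods = [
    ("DT-1", 1200.0, 30),   # (period, operating hours, mission failures)
    ("DT-2", 1500.0, 26),
]

mtbomf = {name: hours / failures for name, hours, failures in periods}
for name, value in mtbomf.items():
    print(f"{name}: MTBOMF = {value:.1f} hours")

growth = mtbomf["DT-2"] / mtbomf["DT-1"] - 1
print(f"Reliability growth: {growth:.0%}")
```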
- 33 -
Priority Three Issue

We seem to have difficulty finding the right developer who can deliver a quality product

We are reactive rather than proactive

- 34 -
Cause and Effect: Poor Acquisition Practices

[Fishbone diagram. Causes: Inadequate Acquisition Strategy, Past Performance, Poor Estimation Techniques, Inadequate Development Expertise, Poor Selection Process, Ineffective Contracting. Effect: Poor Acquisition Practices]

- 35 -
Phase Three
Evolve the Program Office into a world
class systems acquisition team by
developing a strong software and
hardware knowledge base.

- 36 -
What’s Next?

• Phase Three
  – Focus on the required expertise to become a world class program office:
    • Expand the knowledge base from hardware-intensive to software-intensive acquisition management
    • Institutionalize Risk Management across all programs
    • Continually measure performance against standards
    • Remove Crisis Management from program environments
    • Allow managers to focus on what they have to do, not on which fire has to be put out
    • Build an environment of trust where potential problems can get a thoughtful analysis

- 37 -
Consider How Far We’ve Come!

[Diagram: progression from Strategic Planning and Risk Management (1998), through Best Practices and the 16 Point Plan, toward HPO (2002)]

- 38 -
Patterson-Conner
Change Adoption Model

[Diagram: the Patterson-Conner change adoption stages plotted against degree of support for change, with the program’s initiatives and tailored-model activities mapped to each stage.
Preparation Phase: Contact (presentations/PR on how it works, examples), Awareness (longer tutorials for different roles in the organization; quick appraisal to understand the baseline), Understanding (HPO process implementation approach; detailed course for change agents in the model).
Acceptance Phase: Trial Use (Best Practices; customized training for adopters; 16 Pt Plan piloting plan; measurement system).
Commitment Phase: Adoption (Risk Management; tailored model; leading and reinforcing change; periodic organizational re-appraisal), Institutionalization (Strategic Planning; policies/support structures; new employee orientation).]

Garcia, Suzanne, “Are You Prepared for CMMI?”, CrossTalk, Mar 2002

- 39 -
Implementation Strategy

• Assess and baseline acquisition performance based on SA-CMM and DoD instructions
  – Identify deficiencies
  – Provide training
  – Reassess to monitor progress
  – Conduct gap analysis
• Determine high risk areas and provide necessary training to construct a plan for corrective action
• Revitalize the risk management program

- 40 -
Implementation Strategy

• Establish a metrics IPT to provide performance measurements
• Employ a self-evaluation program
• Put into practice the transition mechanisms required to proceed to the next level of excellence

- 41 -
Lessons Learned

• Develop a potential solution set by analyzing causes and effects
• Select two process improvements which align with top risks, resource availability, and team capabilities (a selection sketch follows below)
• Apply a structured approach in increments that are attainable and measurable
  – “Short term wins” to build enthusiasm and commitment to change
  – ROI
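
A minimal, purely illustrative sketch of such a selection: score candidate improvements against risk alignment, resource availability, and team capability, then take the top two. Candidates, scores, and weights are invented:

```python
# Pick the two process improvements best aligned with top risks, resources,
# and team capability. Candidates and scores (0-5) are invented placeholders.
candidates = {
    # name: (risk alignment, resource availability, team capability)
    "Formal inspections":        (5, 3, 4),
    "Earned value tracking":     (4, 4, 3),
    "Smoke testing every build": (3, 5, 5),
    "Reuse risk assessment":     (2, 2, 3),
}

WEIGHTS = (0.5, 0.25, 0.25)  # top risks weighted heaviest

def score(factors):
    return sum(w * f for w, f in zip(WEIGHTS, factors))

selected = sorted(candidates, key=lambda name: score(candidates[name]),
                  reverse=True)[:2]
print("Implement next:", selected)  # the "don't bite the whole apple" step
```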

- 42 -
Lessons Learned

• Measure your progress incrementally and continue to refine your best practices
• Broaden your application after you have proven the concept
• Select the next two improvements and begin the cycle again

- 43 -
Lessons Learned

• Bottom line:
  – Start small, adopt a process, implement the strategy, and continually refine.
  – Don’t become discouraged if improvements are not realized immediately.
  – Get total commitment from every member of the team.
  – Mature your environment and processes in parallel to obtain institutionalization.
  – Employ leadership which creates an environment that is conducive to change and maintains a vigilant focus!

“Don’t Bite the Whole Apple!”


- 44 -
