
Page 1:

Page 2:

Current Practices are Threatening Past Performance as an Effective Tool

Breakout Session #: D01

Gary Poleskey, Colonel, USAF (Ret)
Vice President, Dayton Aerospace, Inc.
CPCM, Fellow

Date: Tuesday, July 26

Time: 11:15am-12:30pm


Page 3:

About Dayton Aerospace

• Small veteran-owned business established in 1984

• Provide management and technical consulting services

– Specialize in hard-to-do tasks requiring experienced acquisition and logistics people

• Highly Experienced – average over 30 years
  – AFMC Center Commanders (previously product, logistics, and test)
  – PEOs, System Program Directors, Product Support Managers, and key program managers
  – Lead functional experts – program, center, and command level
• Balanced Perspective
  – Broad experience with both Industry and Government organizations

Experience that matters… solutions that count!

We provide government and industry teams with reach back to former senior-level personnel who have "been there, done that."

Page 4:

Why are we having this discussion? Avoid Past Mistakes


“Those who fail to learn from history are doomed to repeat it”

– Winston Churchill

Page 5:

Why are we having this discussion? Avoid Past Mistakes


“Those who cannot remember the past are condemned to repeat it.”

– George Santayana (1905)

Page 6:

Past Performance Tools Under Siege Overview


• Poleskey Disclaimer
• Why & How Was Past Performance Policy Changed in 1988?
• Baseline Past Performance Evaluation Process
• Troubling Past Performance Policy Changes
• Concerns & Consequences
• Alternative Recommendation
• What Did You Learn Today?

Page 7:

Poleskey Disclaimer
I'm not pining away for the "Good Old Days"!

• My Background
  – I was a member of the Air Force Tiger Team in 1987
    • Examined the treatment of past performance in source selection and recommended changes
  – Helped write the revised USAF Past Performance policy
  – PRAG Chair on the first major source selection using the "new" past performance assessment process
  – Intimately involved over the nearly 30 years since the policy's creation
• BUT…
  – Strong believer in continuous improvement
  – Strong believer in flexible policy
  – Dedicated to telling you four things you did not know before this session started!!

Page 8:

Past Performance Tools Under Siege Overview


• Poleskey Disclaimer
• Why & How Was Past Performance Policy Changed in 1988?
• Baseline Past Performance Evaluation Process
• Troubling Past Performance Policy Changes
• Concerns & Consequences
• Alternative Recommendation
• What Did You Learn Today?

Page 9:

Why & How Past Performance Policy was Changed – 1988 Tasking & Findings


• Air Force Systems Command Commander's frustration
  – "I know things about these Companies and Programs that are never presented to either the SSAC or to the SSA. Why is that???"
• Study Team's findings:
  – Process was very "vertically focused" – identify a past contract that was exactly like the planned new one
  – Relevancy was very product focused
    • Aircrew Training System = Aircrew Training System
    • Army Training System ≠ Aircrew Training System
    • Development contracts ≠ Production contracts
  – Past performance evaluators tended to be very junior members of the Government team
  – Very difficult to obtain access to knowledgeable people
  – Thus, only big positives or big negatives were ever raised

Page 10:

Why & How Past Performance Policy was Changed – 1988 Major Study Changes – Part 1


• Goal: Raise the stature and perspective of the past performance evaluation
• Established a new risk factor
  – Technical Rating
  – Proposal Risk
  – Added: Past Performance Risk
• Past Performance Risk assessed against source selection criteria
  – Evaluation became a horizontal skills assessment vs. a vertical product assessment
  – Gather higher-fidelity information on how well offerors had demonstrated the skills and expertise to be successful in the future
• Created the Performance Risk Analysis Group (PRAG) to assign the performance risk rating
  – Staffed with more experienced people to evaluate information
  – Select evaluators with knowledge of and experience with the technology and product or service involved in the source selection
  – Provide an integrated and consistent picture to the decision makers

Page 11:

Why & How Past Performance Policy was Changed – 1988 Major Study Changes – Part 2


• Goal: Address the data source problem
• The team was not eager to introduce a new past performance database
  – Many had failed of their own weight
  – But the need to collect contemporaneous information was great
• Established a new contract scorecard system: the CPAR
  – Contractor Performance Assessment Report
  – Nine principal scoring areas, the result of brainstorming important program "issue areas"
• CPAR three-signature structure designed to ensure accuracy
  – Program Manager
  – Contractor
  – PEO or PM's boss

Page 12:

Why & How Past Performance Policy was Changed – 1988 Major Study Changes – Critical Point


• AFSC Commander's "Ah Ha" moment with PRAG and CPAR
  – Marginal and poor current performance would place winning new business at risk
• This linkage is also the reason these changes have been deployed throughout the Federal Government for over 25 years & endured – until now?

Page 13:

Past Performance Tools Under Siege Overview


• Poleskey Disclaimer
• Why & How Was Past Performance Policy Changed in 1988?
• Baseline Past Performance Evaluation Process
• Troubling Past Performance Policy Changes
• Concerns & Consequences
• Alternative Recommendation
• What Did You Learn Today?

Page 14:

Past Performance Evaluation Process Sequence of Events


Categorize & Evaluate Data → Determine Relevancy → Make Preliminary Assessment → Identify Concerns to Offerors → Offerors Provide Feedback → Assign Final Past Performance Confidence

Page 15:

Baseline Past Performance Evaluation Process Section M Examples


• (USAF) Aircraft Avionics Modification RFP ($50 Mil)
  – Recency: Five years (active or completed within the period)
  – Relevancy: Past performance evaluation will be conducted using Section M sub-factors
    • Systems Engineering
    • FAA Airworthiness Assessment
    • Military (MIL-HDBK-516) Airworthiness Assessment
    • Aircraft Integration
    • Training Device Integration
• (Army) Excalibur 1b
  – Recency: Three years (active or completed within the period)
  – Relevancy: The Government will consider the relevancy of the data as it relates to the present solicitation (clear reference to Section M)
    • Compliance with Performance Specifications
    • Producibility (including transition to production)
    • Management Oversight
    • Systems Engineering

Page 16:

Past Performance Evaluation Process Contract Relevancy Matrix


[Relevancy matrix: rows list the offeror team's prime contracts (Contract 1, Contract 2, Contract 3, …) and teammate contracts (… Contract N); columns list Sub-Factors 1–6. An X marks each sub-factor a given contract is relevant to; the exact cell placements are not recoverable. Column totals show how many contracts inform each sub-factor: 3, 6, 4, 3, 4, 5.]
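Read as data, the matrix is just a map from each contract to the set of sub-factors it can speak to, with column totals showing how much evidence backs each sub-factor. A minimal Python sketch of that structure follows; the specific X placements are invented for illustration, since the slide's cell positions are not recoverable:

```python
# Hypothetical relevancy matrix: contract -> set of relevant sub-factors (1-6).
# Cell placements are illustrative; only the idea (X marks per contract and
# column totals per sub-factor) comes from the slide.
relevancy = {
    "Contract 1": {1, 2, 4, 6},
    "Contract 2": {2, 3, 4, 5, 6},
    "Contract 3": {1, 2, 5, 6},
    "Contract N": {2, 3, 5, 6},
}

# Column totals: how many contracts can inform each sub-factor's evaluation.
totals = {sf: sum(sf in marked for marked in relevancy.values())
          for sf in range(1, 7)}
print(totals)  # {1: 2, 2: 4, 3: 2, 4: 2, 5: 3, 6: 4} for this made-up data
```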

Page 17:

Past Performance Evaluation Process Inside the Evaluator's Mind

[Figure: a notional Software Development sub-factor evaluation. Quality inputs (five total): CPAR Block #14A ratings for Contract 4 (Yellow) and Contract 8 (Green), plus questionnaire ratings for Contract 11 (Green), Contract 19 (Yellow), and Contract 25 (Blue). Relevancy judgments on a 1 (low) to 5 (high) scale, based on ingredients such as complexity, lines of code, program stage, and re-use: Contract 4 = 5, Contract 8 = 3, Contract 11 = 2, Contract 19 = 4, Contract 25 = 1. Weighing quality against relevancy, the evaluator reaches a past performance confidence assessment – here, Limited Confidence.]
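A rough way to picture the judgment step: treat each color rating as a number and weight it by the contract's relevancy score. This is only a stand-in for what the slide explicitly calls judgment; the color-to-number mapping and the weighted average are my assumptions, while the input values are the slide's example:

```python
# Stand-in for the evaluator's judgment: quality ratings (CPAR/questionnaire
# colors) weighted by each contract's relevancy score (1 = low .. 5 = high).
COLOR_VALUE = {"Blue": 5, "Purple": 4, "Green": 3, "Yellow": 2, "Red": 1}

# (contract, quality color, relevancy 1-5), taken from the slide's example.
inputs = [(4, "Yellow", 5), (8, "Green", 3), (11, "Green", 2),
          (19, "Yellow", 4), (25, "Blue", 1)]

weighted = sum(COLOR_VALUE[color] * rel for _, color, rel in inputs)
score = weighted / sum(rel for _, _, rel in inputs)
print(round(score, 2))  # ~2.53: the strongest ratings sit on low-relevancy
                        # work, consistent with the slide's Limited Confidence
```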

Page 18:

Past Performance Evaluation Process Scoring Roll-up


[Scoring roll-up: the same relevancy matrix of prime and teammate contracts against Sub-Factors 1–6 (column totals 3, 6, 4, 3, 4, 5), now with a confidence rating under each sub-factor column – Substantial, Substantial, Substantial, Neutral, Satisfactory, Satisfactory – rolling up to an overall rating of Substantial Confidence.]
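The roll-up from six sub-factor ratings to one overall confidence rating is likewise an evaluator judgment, not a formula. As a sketch only, a simple plurality rule (pick the most common rating, breaking ties toward higher confidence) happens to reproduce the slide's example:

```python
from collections import Counter

# Confidence scale ordered from highest to lowest (per the deck).
SCALE = ["Substantial", "Satisfactory", "Neutral", "Limited", "No"]

def roll_up(subfactor_ratings):
    """Hypothetical plurality roll-up; the real overall rating is a
    judgment. Ties break toward the higher-confidence rating."""
    counts = Counter(subfactor_ratings)
    return min(SCALE, key=lambda r: (-counts[r], SCALE.index(r)))

# The slide's example: Sub, Sub, Sub, Neutral, Sat, Sat -> Substantial.
print(roll_up(["Substantial", "Substantial", "Substantial",
               "Neutral", "Satisfactory", "Satisfactory"]))
```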

Page 19:

Past Performance Tools Under Siege Overview


• Poleskey Disclaimer
• Why & How Was Past Performance Policy Changed in 1988?
• Baseline Past Performance Evaluation Process
• Troubling Past Performance Policy Changes
  1. Relevancy Assessment Criteria
  2. Relevancy Assessment Scoring
  3. Confidence Ratings
• Concerns & Consequences
• Alternative Recommendation
• What Did You Learn Today?

Page 20:

Troubling Policy Changes 1. Relevancy Assessment Criteria

USAF 2008 and Prior
  – Description: "The Past Performance Evaluation will be accomplished by …… focusing on and targeting performance which is relevant to Mission Capability sub-factors and the Cost factor."
  – Changes: Instructions substantially unchanged since the 1988 Study was implemented

DoD 2011
  – Description:
    • "The criteria to establish what is …relevant shall be unique to each source selection and stated in the solicitation"
    • "…consideration should be given to those aspects of an offeror's contract history that would give the greatest ability to measure whether the offeror will satisfy the current procurement."
    • "Common aspects of relevancy include similarity of service/support, complexity, dollar value, contract type, and degree of subcontracting (or) teaming."
  – Changes:
    • No mention made of source selection criteria Factors or Sub-Factors
    • Only two of the examples are actually common "aspects" that could relate one procurement to another
    • The others merely describe contract characteristics

DoD 2016
  – Description: Essentially the same language as DoD 2011
  – Changes: No change

Page 21:

Troubling Policy Changes 1. Relevancy Assessment Criteria – My Perspective


• USAF 2008 & Prior
  – Focus on Mission Capability Factors and Sub-Factors was done exactly because they ARE the criteria that provide "the greatest ability to measure" future success
• DoD 2011 & 2016
  – Revised language provides little guidance to source selection teams on how to select criteria in the absence of a reference to Mission Capability Factors and Sub-Factors
  – Only two of the example "common aspects of relevancy" are actually aspects, while the others drive teams to think vertically – how does the past contract relate to the new one?
  – As in 1987, vertical thinking drives product-to-product comparison rather than skills and capability comparisons
  – Even though the use of Sub-Factors is still acceptable, there is no policy language to encourage teams to think that way

Page 22:

Troubling Policy Changes 1. Relevancy Assessment Criteria
Confusing Relevancy Assessment Criteria Language – Example: Aircraft Avionics RFP


• Technical Sub-Factors:
  – Systems Engineering
  – FAA Airworthiness Assessment
  – Military (MIL-HDBK-516) Airworthiness Assessment
  – Aircraft Integration
  – Training Device Integration
• Relevancy Assessment
  – How closely do past products or services relate to the Sub-Factors?
  – The Government will only consider specific efforts (present and past contracts) that involve Avionics and Training Device modifications

Page 23:

Troubling Policy Changes 2. Relevancy Assessment Scoring

USAF 2005
  – Scale: Very Relevant – Relevant – Semi Relevant – Not Relevant (USAF Past Performance Guide)
  – Changes: Definitions substantially unchanged since 1988

USAF 2008
  – Scale: Very Relevant – Relevant – Somewhat Relevant – Not Relevant (USAF Past Performance Guide)
  – Changes: Definition wording streamlined & "SR" redefined

DoD 2011
  – Alternative 1: Very Relevant – Relevant – Somewhat Relevant – Not Relevant
  – Alternative 2: Relevant – Not Relevant
  – LPTA: Relevant – Not Relevant
  – Changes:
    • Teams given a choice between two scoring schemes
    • "NR" includes "little"
    • LPTA relevancy scoring actually not specified

DoD 2016
  – Alternative 1: Very Relevant – Relevant – Somewhat Relevant – Not Relevant
  – Alternative 2: Acceptable – Unacceptable
  – LPTA: Acceptable – Unacceptable
  – Changes:
    • Alt 1 – no change from 2011
    • Alt 2 – uses a confidence term for relevancy scoring
    • LPTA relevancy scoring still not specified

Page 24:

Troubling Policy Changes 2. Relevancy Assessment Scoring – My Perspective


• DoD 2011
  – Trade-off source selections:
    • Requires the buying team to decide if past performance will "require less discrimination" in order to choose between Alternatives 1 and 2
    • Requires a judgment that cannot normally be made during the solicitation development phase
    • However, the buying team will know that Alternative 2 is easier & faster – teams will opt for the path of least resistance
  – LPTA:
    • Assessing past contracts as either Relevant or Not Relevant is reasonable (does not apply to Performance-Price Tradeoff)
• DoD 2016
  – Same issues as the DoD 2011 language
  – Requires the team to figure out what acceptable or unacceptable relevancy might be for Alternative 2 and LPTA

Page 25:

Troubling Policy Changes 3. Confidence Ratings

USAF 2005
  – Scale: High Confidence – Significant Confidence – Satisfactory Confidence – Unknown Confidence – Little Confidence – No Confidence (USAF Past Performance Guide)
  – Changes: Definitions substantially unchanged since 1999, when the Confidence Rating was introduced

USAF 2008
  – Scale: Substantial Confidence – Satisfactory Confidence – Limited Confidence – No Confidence – Unknown Confidence (USAF Past Performance Guide)
  – Changes: Eliminates the distinction between outstanding and good performance

DoD 2011
  – Scale: Substantial Confidence – Satisfactory Confidence – Limited Confidence – No Confidence – Unknown Confidence
  – LPTA: Acceptable or Unacceptable
  – Changes: No significant change from the USAF definitions; LPTA departs from confidence scoring & equates Unknown Confidence with Acceptable

DoD 2016
  – Alternative 1: Substantial Confidence – Satisfactory Confidence – Neutral Confidence – Limited Confidence – No Confidence
  – Alternative 2: Satisfactory Confidence – Neutral Confidence – Limited Confidence – No Confidence
  – LPTA: Acceptable or Unacceptable
  – Changes:
    • Alt 1 – no change to definitions
    • Unknown Confidence renamed and moved (see 2005)
    • Alt 2 – eliminates the distinction between acceptable and good performance
    • LPTA – no change from 2011

Page 26:

Troubling Policy Changes 3. Confidence Ratings – My Perspective


• USAF 2008
  – Losing the ability to distinguish between "Blue" and "Purple" performance hurt industry and evaluators
• DoD 2011
  – LPTA: Equating "Unknown" with "Acceptable" will bother some SSAs
• DoD 2016
  – Trade-off source selections:
    • Requires the buying team to decide if past performance will "require less discrimination" in order to choose between Alternatives 1 and 2
    • That judgment cannot normally be made during RFP development
    • Alternative 2 loses the ability to distinguish between "Blue" and "Green" performance – it really hurts industry and evaluators & looks like LPTA scoring
  – Using "Past Performance" scoring for "Experience" is misguided
    • Just "doing it" does not equal confidence
    • Much better fit as part of the Technical or Proposal Risk rating

Page 27:

Past Performance Tools Under Siege Overview


• Poleskey Disclaimer
• Why & How Was Past Performance Policy Changed in 1988?
• Baseline Past Performance Evaluation Process
• Troubling Past Performance Policy Changes
• Concerns & Consequences
• Alternative Recommendation
• What Did You Learn Today?

Page 28:

Concerns & Consequences Advances of 1988 Study Are In Danger of Reversal


1988 Past Performance Study Policy Change → Impact of DoD 2016 Policy Changes?

1. Focus on skills and expertise → Focus will be on product characteristics, e.g., product similarity, complexity, dollar value
2. Evaluation became a horizontal skills assessment vs. a vertical product assessment → Product-to-product comparison will drive a vertical orientation
3. Use more experienced past performance evaluators → There will be little to no need for senior, experienced evaluators
4. Establish a visible link between current performance and future business → A "pass vs. fail" past performance source selection environment will greatly reduce the seriousness of CPAR risk for industry
5. Add Past Performance Sub-Factor → No impact

Page 29:

Concerns & Consequences Future Quote from Source Selection Authority – 2017


“I know things about these companies and programs that are never presented to either the SSAC or to the SSA. Why is that???”

Page 30:

Past Performance Tools Under Siege Overview


• Poleskey Disclaimer
• Why & How Was Past Performance Policy Changed in 1988?
• Baseline Past Performance Evaluation Process
• Troubling Past Performance Policy Changes
• Concerns & Consequences
• Alternative Recommendation
• What Did You Learn Today?

Page 31:

Alternative Recommendation Streamline Vice Reversing The Past


• Apply the DoD 2016 policy for all Evaluation Factors and Sub-Factors (emphasis added) to Past Performance
  – "Factors and sub-factors represent those specific characteristics that are tied to significant RFP requirements and objectives having an impact on the source selection decision and which are expected to be discriminators or are required by statute/regulation. They are the uniform baseline against which each offeror's proposal is evaluated, allowing the Government to make a best value determination." (2.3.1)
  – "When developing source selection criteria, consider hybrid approaches, applying subjective and objective criteria as appropriate to evaluate elements of the proposal." (1.3)
  – "Source selections can be simplified when only those requirements that are critical to the user are subjectively evaluated by the SST and the rest of the requirements are evaluated on an acceptable/unacceptable basis" (1.3.1.2)

Page 32:

Alternative Recommendation Hybrid Structure For Past Performance Confidence Ratings Example #1 – Aircraft Avionics Modification ($50 Mil)


• Technical Sub-Factors:
  1. Systems Engineering
  2. FAA Airworthiness Assessment
  3. Military (MIL-HDBK-516) Airworthiness Assessment
  4. Aircraft Integration
  5. Training Device Integration
• Hybrid Approach (see the sketch after this list)
  – Relevancy Criteria: Sub-Factor level
  – Relevancy Scoring:
    • S/F 1, S/F 2, S/F 3 = (Alt 2) Acceptable – Unacceptable
    • S/F 4 & S/F 5 = (Alt 1) Four-level scoring (VR-R-SR-NR)
  – Confidence Scoring:
    • Combine S/F 1, S/F 2, & S/F 3 = (Alt 2) (Sat – Neutral – Limited – No)
    • S/F 4 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
    • S/F 5 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
  – S/F-level scoring requires a waiver – in my view SSAs should be given this option
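To make the hybrid structure concrete, here is one way to encode Example #1 as configuration data. The dictionary layout and names are mine; the sub-factor groupings and scales come straight from the slide:

```python
# Hypothetical encoding of the Example #1 hybrid scheme. Sub-factors scored
# under Alternative 2 share one combined group; the scales mirror the slide.

ALT1_RELEVANCY  = ["Very Relevant", "Relevant", "Somewhat Relevant", "Not Relevant"]
ALT2_RELEVANCY  = ["Acceptable", "Unacceptable"]
ALT1_CONFIDENCE = ["Substantial", "Satisfactory", "Neutral", "Limited", "No"]
ALT2_CONFIDENCE = ["Satisfactory", "Neutral", "Limited", "No"]

hybrid_example_1 = {
    "relevancy": {
        ("SF-1", "SF-2", "SF-3"): ALT2_RELEVANCY,   # pass/fail relevancy
        ("SF-4",): ALT1_RELEVANCY,                  # four-level relevancy
        ("SF-5",): ALT1_RELEVANCY,
    },
    "confidence": {
        ("SF-1", "SF-2", "SF-3"): ALT2_CONFIDENCE,  # combined into one rating
        ("SF-4",): ALT1_CONFIDENCE,
        ("SF-5",): ALT1_CONFIDENCE,
    },
}
```

The same shape covers Examples #2 through #4 by swapping the sub-factor groupings.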

Page 33:

Alternative Recommendation Hybrid Structure For Past Performance Confidence Ratings Example #2 – Excalibur 1b ($500+ Mil)


• Technical Sub-Factors:
  1. Compliance with Performance Specifications
  2. Producibility (including transition to production)
  3. Management Oversight
  4. Systems Engineering
• Hybrid Approach
  – Relevancy Criteria: Sub-Factor level
  – Relevancy Scoring:
    • S/F 3 & S/F 4 = (Alt 2) Acceptable – Unacceptable
    • S/F 1 & S/F 2 = (Alt 1) Four-level scoring (VR-R-SR-NR)
  – Confidence Scoring:
    • Combine S/F 3 & S/F 4 = (Alt 2) (Sat – Neutral – Limited – No)
    • S/F 1 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
    • S/F 2 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
  – S/F-level scoring requires a waiver – in my view SSAs should be given this option

Page 34:

What Did You Learn Today?


1. Winston Churchill was not the first person to warn us of the dire consequences of failing to learn from history
   – The warning applies to Past Performance policy today
2. Why CPARs and Past Performance Evaluation Teams exist
3. Implementation of the current policy poses a threat to effective Past Performance scoring in source selection, as well as a risk to the utility of the CPAR
4. There is an alternative for evaluating Past Performance in source selection, drawn directly from what the current policy recommends for all other Factors and Sub-Factors
   – Requires less manpower than traditional scoring
   – Is much, much more effective than "Alternative 2" scoring
   – Requires a slight change to the policy to allow Past Performance scoring at the Sub-Factor level
   – Strengthens the link between a past performance track record and the ability to win new business

Page 35:

Contact Information

Gary Poleskey, Colonel, USAF (Ret)
Vice President, Dayton Aerospace, Inc.
[email protected]
937.426.4300
4141 Colonel Glenn Hwy, Suite 252
Dayton, Ohio 45431


Page 36:

Back Up Slides


Page 37:

Past Performance Evaluation Process Total Team Evaluation (Hypothetical Example)


Sample source selection sub-factors mapped across the proposed team (X = past performance evaluated, --- = not applicable, N/A = unnamed subs not evaluated).

Columns: Prime Contractor ABLE Div. A (Airframe) · Subcontractor ABLE Div. B (Offensive Avionics) · Sub-1 (Aircrew Training System) · Sub-2 (CLS) · Unnamed Subs (Radar, Other Avionics)

1. Ops Utility:          X, X, ---, ---, N/A
2. Software Development: X, X, X, ---, N/A
3. Training Effect.:     X, ---, X, ---, N/A
4. Integ. Log. Sup.:     X, ---, X, X, N/A
Cost Factor:             X, X, X, X, N/A

Page 38:

Alternative Recommendation Hybrid Structure For Past Performance Confidence Ratings Example #3 – Small Diameter Bomb II ($500+ Mil)


• Technical Sub-Factors:
  – S/F-1: Adherence to cost & schedule
  – S/F-2: Capability to deliver the system required by the RFP
  – S/F-3: Systems Engineering
  – S/F-4: Management Effectiveness
• Hybrid Approach
  – Relevancy Criteria: Sub-Factor level
  – Relevancy Scoring: (Alt 1) Four-level scoring (VR-R-SR-NR)
  – Confidence Scoring:
    • Combine S/F-1, S/F-3, & S/F-4 = (Alt 2) (Sat – Neutral – Limited – No)
    • S/F-2 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
  – S/F-level scoring requires a waiver – in my view SSAs should be given this option

Page 39:

Alternative Recommendation Hybrid Structure For Past Performance Confidence Ratings Example #4 – Missile Guidance System ($30 Mil)


• Technical Sub-Factors:
  1. Guidance System Design
  2. Software Design and Re-use
  3. Subcontract Management
  4. Management Effectiveness
  5. Systems Engineering
• Hybrid Approach
  – Relevancy Criteria: Sub-Factor level
  – Relevancy Scoring:
    • S/F 1, S/F 4, S/F 5 = (Alt 2) Acceptable – Unacceptable
    • S/F 2 & S/F 3 = (Alt 1) Four-level scoring (VR-R-SR-NR)
  – Confidence Scoring:
    • Combine S/F 1, S/F 4, & S/F 5 = (Alt 2) (Sat – Neutral – Limited – No)
    • S/F 2 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
    • S/F 3 = (Alt 1) (Sub – Sat – Neutral – Limited – No)
  – S/F-level scoring requires a waiver – in my view SSAs should be given this option