Page 1

Results of IAC Study of Metrics in Electronic Records Management (ERM) Systems

Dr. Rick Klobuchar
Vice President and Chief Technology Officer
SAIC - Enterprise Solutions Business Unit
2829 Guardian Lane
Virginia Beach, VA 23452
[email protected]
(757) 631-2335

Dr. Mark Giguere
Lead IT (Policy & Planning)
ERM E-Gov co-Program Manager
Modern Records Programs
NARA
[email protected]
(301) 837-1744

Page 2

Introduction and Principal Conclusions

How does one measure the impact of an ERM system on the bottom-line business or mission of an organization?
What is the business case for an enterprise ERM system?

Principal conclusions:
• No silver bullet
• No universal COTS tool or product
• No one metric captures the success of an ERM system and relates unambiguously to the bottom line
  Notwithstanding:
  – Some common categories of metrics are in use today
  – Some metrics are less burdensome to capture than others
  – Some metrics just reflect a measure of IT system performance
  – Some metrics reflect mission success more directly than others
• Measurement of ERM performance is currently immature
• Most measurements tend to be IT-related rather than related to records management itself
• Valid comparisons of ERM practices across organizations are difficult to make, and probably should not be made

Page 3

Bottom Line

The inescapable conclusion:
• There is no simple, single answer!
• There is no Swiss Army Knife-like tool
• Tradeoffs must be made to arrive at metrics that are:
  – Meaningful to measure ERM success (e.g., “good” vs. “bad” metrics), and
  – Not too burdensome to capture on an enterprise-wide basis
• “What gets measured is what gets done”
• Aggregation of metrics into a single coherent picture of bottom-line performance is problematic

Page 4

Concerns to Consider

Metrics for Public Services Relating to ERM
• The spirit of the eGovernment initiative is to provide a Government that “works better and costs less.”
• Quantifiable and well-defined ERM metrics relating to capacity, throughput, security (especially data and records integrity), assured service availability, ubiquitous access, lower cost, improved turnaround times, etc. are of interest.
• Also of concern are metrics that are unreliable, non-specific, intractable to interpret, or too burdensome or onerous to collect.

Page 5

Major Factors to Consider

Who is the Consumer?
• The nature of the “consumer” is an important factor
• “Who” and/or “what” the metrics are sampling:
  – “Public at large”
  – Specific customers
  – Agency/company employees
  – Federal agencies
  – Other government agencies
  – Corporations
  – Foreign users, etc.

What is the ERM Business Practice?
• The specific “bottom-line” agency and/or industry business practices the metrics support. For example:
  – Servicing FOIA requests
  – Support for legal discovery
  – Historical research
  – Genealogy
  – Auditing and controls
  – Regulatory compliance
  – Public information dissemination
  – Statistical analysis
  – Archival records management
  – Grants management
  – ERM systems operations and management
  – Specific mission support (e.g., medical, environmental, emergency and disaster, defense)

Page 6

Principles in Defining ERM Metrics

• Not everything that can be measured needs to be measured, nor should it be
• Metrics should have a purpose: continuing improvement
• It is best to design the capture and management of metrics into a system up front, or to provide for an SLM approach
• There are important “paper vs. electronic” paradigm issues to be understood

Page 7

Broad Categories of ERM Metrics

• Access to ERM Services
• Accuracy
• Capacity
• Efficiency
• Participation
• Productivity
• Search and Retrieval
• System
• User Satisfaction
• Utilization
• Legal *

* Suggested to the IAC team by Robert Williams of Cohasset Associates

Page 8

“Good” vs. “Bad” Metrics

Many metrics are potentially ambiguous, intractable, unreliable, or burdensome to capture.
Among the more problematic metrics:
• Record search time
• Record retrieval time
• Number of seats (or licenses)
• Session time
• Raw number of records in the system

All of the above can be captured. However, the interpretation of each can be quite controversial:
• A long session time, for example, could be indicative of great success or utter failure
• Search times can be curiosity-driven, as in surfing the Web
• A user's level of commitment and persistence cannot be easily measured
• Some people are just better than others at “finding things”
• Training, domain knowledge, and time of day can be important mitigating factors

Page 9

Sample Candidate Metrics for ERM Systems

Measurement Category | Metric | Capture Method | Capture Medium | Capture Burden | Comments
Access to Services | Hours of operation | Manual | Periodic audit | Low | Almost certainly greatly improved with automation
Access to Services | Access points | Automated | System | Low | Almost certainly greatly improved with automation
Accuracy | Percentage of records correctly declared | Manual | Periodic audit | High | Measure of quality
Accuracy | Percentage of records correctly classified | Manual | Periodic audit | High | Measure of quality
Capacity | Size of holdings, i.e., number of records (possibly by record type) | Automated | System | Low | No indication of quality
Efficiency | Ease of performing daily tasks | Manual | Survey | High | Purely subjective but indicative of success and acceptance of ERM
Participation | Number of seats | Automated | System | Low | No indication of quality
Participation | Number of people declaring records | Manual | Live oversight | Medium | Indicative of acceptance of system
Participation | Number of people classifying records | Manual | Live oversight | Medium | Indicative of acceptance of system
Participation | Number of people retrieving records | Manual | Live oversight | Medium | Indicative of acceptance of system

Page 10

Sample Candidate Metrics for ERM Systems (cont.)

Measurement Category | Metric | Capture Method | Capture Medium | Capture Burden | Comments
Productivity | Number of requests processed per week | Automated | System | Low for one system; High across enterprise | Difficult to measure enterprise-wide across multiple processes; may only be useful as a sampling metric, e.g., for FOIA requests only
Search and Retrieval | System search time | Automated | System | Low | No indication of quality
Search and Retrieval | System retrieval time | Automated | System | Low | No indication of quality
Search and Retrieval | Number of successful searches | Automated | System | Low | Difficult to interpret; a returned result is not necessarily the desired result
Search and Retrieval | Number of search indexes | Automated | System | Low | Indicator of complexity and therefore ease of use
Search and Retrieval | Number of classification categories | Automated | System | Low | Indicator of complexity and therefore ease of use
System | Throughput, i.e., transactions per hour or per unit of time | Automated | System | Low | Measures IT performance, not success of ERM
System | Response time, i.e., time to retrieve a record | Automated | System | Low | Measures IT performance, not success of ERM
System | Availability, i.e., system uptime | Automated | System | Low | Measures IT performance, not success of ERM

Page 11

Sample Candidate Metrics for ERM Systems (cont.)

Measurement Category | Metric | Capture Method | Capture Medium | Capture Burden | Comments
User Satisfaction | User satisfaction rating | Manual | Survey | High | Nearly universal metric for ERM exemplars
Utilization | Number of people retrieving records | Automated | System | Low | Indicative of acceptance of system; no indication of success or satisfaction
Utilization | Virtual visitors | Automated | System | Low | Indicative of acceptance of system; no indication of success or satisfaction
Legal | Numbers and types of process violations that are caught, missed, and/or attempted | Semi-Automatic | System | Medium | Measure of accuracy and quality of the ERM processes, with potential legal weight, significance, and bearing
Legal | Fraction of the inventory of electronic records within an ERM system that is in the wrong state | Semi-Automatic | System | Medium-High | Indicative of the quality of the processes and services provided within an ERM system

Note: Any of these metrics should be used to measure improvement over time relative to a baseline. The numbers are not meaningful in and of themselves. Additionally, the Study Group determined that there is no universal, “silver bullet” metric.
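The note above — that these numbers are only meaningful as improvement over time relative to a baseline — amounts to a simple computation. The sketch below is a hypothetical illustration (the study defines no formula; the function name and the sample audit figures are invented):

```python
def relative_improvement(baseline: float, current: float) -> float:
    """Fractional change of a metric from its baseline measurement.

    Positive values mean the metric increased; whether that counts as
    an improvement depends on the metric (more records correctly
    classified is good, longer retrieval time is bad).
    """
    if baseline == 0:
        raise ValueError("baseline must be non-zero to compute relative change")
    return (current - baseline) / baseline

# Example: percentage of records correctly classified, from two
# hypothetical periodic audits.
baseline_pct = 72.0   # first-quarter audit (assumed value)
current_pct = 81.0    # second-quarter audit (assumed value)
print(f"{relative_improvement(baseline_pct, current_pct):+.1%}")  # → +12.5%
```

The same trend-over-baseline framing applies to any row of the tables above; the zero-baseline guard matters in practice because a newly deployed ERM system starts many of these counters at zero.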

Page 12

Summary

Principal conclusions:
• No silver bullet
• No universal COTS tool or product
• No one metric captures the success of an ERM system and relates unambiguously to the bottom line
  Notwithstanding:
  – Some common categories of metrics are in use today
  – Some metrics are less burdensome to capture than others
  – Some metrics just reflect a measure of IT system performance
  – Some metrics reflect mission success more directly than others
• Measurement of ERM performance is currently immature
• Most measurements tend to be IT-related rather than related to records management itself
• Valid comparisons of ERM practices across organizations are difficult to make, and probably should not be made

Bottom Line

The inescapable conclusion:
• There is no simple, single answer!
• There is no Swiss Army Knife-like tool
• Tradeoffs must be made to arrive at metrics that are:
  – Meaningful to measure ERM success (e.g., “good” vs. “bad” metrics), and
  – Not too burdensome to capture on an enterprise-wide basis
• “What gets measured is what gets done”
• Aggregation of metrics into a single coherent picture of bottom-line performance is problematic