
Page 1: Heather Dunn, CHIN

My Resource for Excellence.

Canadian Heritage Information Network

Creation of the Collections Management Software Review (CMSR)

Heather Dunn, CHIN

Page 2: Heather Dunn, CHIN

Presentation Summary

CHIN and its mandate

What is the CHIN Collections Management Software Review?

Why did CHIN undertake this project?

How was the Review conducted?
- Evaluation team
- Tools
- Process
- Publication

Page 3: Heather Dunn, CHIN

National Centre of Excellence for Museums

Part of the Department of Canadian Heritage, created in 1972

Develops and provides skill development, products and services for heritage professionals

Supports the development, presentation and marketing of digital content through innovative technologies

Page 4: Heather Dunn, CHIN

CHIN – 1972-2010

1972 – The National Inventory Program
1982 – Canadian Heritage Information Network
1995 – First Corporate / Professional Web site
1999 – Artefacts Canada National Database
2001 – Virtual Museum of Canada
2009 – Redesign of Web sites, 3 portals:

The Corporate site www.chin.gc.ca

The Professional Exchange www.pro.rcip-chin.gc.ca

The Virtual Museum of Canada www.virtualmuseum.ca

Page 5: Heather Dunn, CHIN

Active Network of Heritage Institutions

More than 1,400 not-for-profit heritage member institutions of all sizes and disciplines, from across Canada

Over 35 years of experience

National and international partnerships

Page 6: Heather Dunn, CHIN

What Is the Collections Management Software Review (CMSR)?

A series of CHIN publications which evaluated collections management software products for museums

Four editions, published between 1996 and 2003

Page 7: Heather Dunn, CHIN

What Is the CMSR (continued)?

Assessed the suitability of specific software to museum discipline, collections size, museum functions, and hardware and software environment

Analyzed vendor reliability, support requirements, customization possibilities, and costs

Ensured that the software met CHIN and international standards, and allowed for importing and exporting data

Page 8: Heather Dunn, CHIN

Why Did CHIN Undertake the CMSR Project?

In 1995, CHIN began assisting museums with the transition of their collections data from the CHIN mainframe to in-house collections management systems

The CMSR was created to assist museums with the transition by helping them select appropriate software

The transition was accomplished by 1998

Today, museums maintain their own collections management data in-house, and periodically upload data to the Canadian national database, now called "Artefacts Canada"

Page 9: Heather Dunn, CHIN

Editions of the Review

CHIN published four editions of the Review:

Edition 1 (1996) reviewed 11 software products
Edition 2 (1997) reviewed 16 software products
Edition 3 (2000) reviewed 18 software products
Edition 4 (2003) reviewed 16 software products

Page 14: Heather Dunn, CHIN

Product Reports – excerpt

Page 22: Heather Dunn, CHIN

How Was the Review Produced?

- Creation of the Criteria Checklist
- Request for Information
- Evaluation Team
- Demonstrations/Evaluations
- Publication

Page 23: Heather Dunn, CHIN

Creation of the Criteria Checklist

A list of over 500 functions that can be performed by a collections management system. For example:

- Does the system allow the user to record the person who moved an object or specimen lot? Demonstrate.
- Is it possible for external pre-built thesaural files to be integrated into the software? Demonstrate.

The Checklist was a key tool in the creation of the Review: it was the basis for comparison used to assess and rate each function performed by the various software packages
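
For illustration only, the sample criteria above could be represented in a structure like the following Python sketch; the field names and category labels are assumptions of this sketch, not CHIN's actual data model.

    from dataclasses import dataclass

    @dataclass
    class Criterion:
        criterion_id: int   # position within the ~500-item Checklist
        category: str       # hypothetical grouping, e.g. "Location tracking"
        question: str       # the function the vendor is asked to demonstrate

    checklist = [
        Criterion(1, "Location tracking",
                  "Does the system allow the user to record the person "
                  "who moved an object or specimen lot? Demonstrate."),
        Criterion(2, "Thesauri",
                  "Is it possible for external pre-built thesaural files "
                  "to be integrated into the software? Demonstrate."),
    ]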

Page 24: Heather Dunn, CHIN

Requesting Information from Software Vendors

A request for product information was sent to over 40 collections management software vendors internationally

The request:
- outlined the parameters for the evaluation
- asked the vendor for pertinent information such as vendor and product descriptions, product costs, etc.
- included the Criteria Checklist to be completed by the vendor

Responding software vendors indicated which functions within the Checklist they could perform, and were scheduled to demonstrate those functions that they claimed to support.

Page 25: Heather Dunn, CHIN

The Evaluation Team

The Evaluation Team for the Reviews consisted of:
- 4 CHIN staff members (3 for some Editions) who were dedicated full-time to the Review
- Approximately 20 museum professionals volunteering as reviewers

The volunteer team members were generally from Canadian or U.S. museums that were looking for software

To find volunteer evaluators, CHIN notified the museum community of the opportunity to evaluate collections management software

Respondents included registrars, curators and collections managers. All had a background in collections management, but represented a wide variety of museum sizes and disciplines.

Page 26: Heather Dunn, CHIN

Product Demonstrations

Each software vendor that had responded to the Request for Information was scheduled to demonstrate their software

For the earlier Editions:
- Demo at the CHIN office
- A 2-day demo of all the Checklist items the vendor supported
- Approximately 20 evaluators, from Canadian museums – some local, many remote

For later Editions:
- Evaluations were “taken to the community” – demos in conjunction with U.S. and Canadian museum conferences (e.g., AAM, CMA)
- A 1-day demo of selected criteria (169 of 500)
- The list of criteria selected for evaluation was not shared with the vendor in advance
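
The presentation does not say how the 169 demonstration criteria were chosen, so the sketch below simply assumes a random sample drawn from the full Checklist; the function name and the use of random sampling are illustrative assumptions.

    import random

    def select_demo_criteria(checklist, k=169, seed=None):
        """Draw k criteria from the full Checklist for a 1-day demo.
        The resulting list is kept private until the demonstration."""
        return random.Random(seed).sample(checklist, k)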

Page 27: Heather Dunn, CHIN

Evaluation

The Evaluation Team followed the Criteria Checklist, requesting the vendor to demonstrate only the functions they could support

For each function demonstrated, team members provided:
- Scores (e.g., Good, Fair, Poor, or Does not Perform, with the addition of “+” or “-” for more accuracy)
- Comments on each demonstrated criterion
- A narrative overall evaluation of the software

The scores were converted to numeric values, averaged and summarized for the “Software Review”

Detailed average scores and comments were made available within “Product Profiles”, one for each software product
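
As a rough illustration of the score roll-up described above: the presentation does not give the numeric mapping CHIN used, so the base values and the 0.3 adjustment for “+” and “-” below are assumptions.

    BASE_SCORES = {"Good": 3.0, "Fair": 2.0, "Poor": 1.0,
                   "Does not Perform": 0.0}

    def to_numeric(score):
        """Convert a score such as 'Good+' or 'Fair-' to a number."""
        value = BASE_SCORES[score.rstrip("+-")]
        if score.endswith("+"):
            value += 0.3
        elif score.endswith("-"):
            value -= 0.3
        return value

    def average_score(scores):
        """Average all evaluators' scores for one demonstrated criterion."""
        return sum(to_numeric(s) for s in scores) / len(scores)

    # For example: average_score(["Good", "Good-", "Fair+"]) is about 2.67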

Page 28: Heather Dunn, CHIN

Publication

The Software Review and Criteria Checklist were made freely available on the CHIN Web site; printed versions were sold

The Criteria Checklist was also available in a “customizable” version online that allowed museums to select the criteria they required from the Checklist, and produce a custom report detailing which software products met their selected criteria and how they performed

“Product Profiles” (detailed reports on individual software products) were given to CHIN member museums on request, but sold to others
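
A minimal sketch of how the customizable report could work behind the scenes, assuming each product's averaged scores are keyed by criterion; the data layout and the pass threshold are illustrative assumptions, not CHIN's implementation.

    def custom_report(selected_ids, product_scores, threshold=2.0):
        """For each criterion a museum selects, list the products whose
        average score meets the threshold, and how they performed."""
        report = {}
        for cid in selected_ids:
            report[cid] = {product: scores[cid]
                           for product, scores in product_scores.items()
                           if scores.get(cid, 0.0) >= threshold}
        return report

    # For example, with two hypothetical products:
    # custom_report([1, 2], {"Product A": {1: 2.7, 2: 1.3},
    #                        "Product B": {1: 3.0, 2: 2.3}})
    # -> {1: {'Product A': 2.7, 'Product B': 3.0}, 2: {'Product B': 2.3}}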

Page 29: Heather Dunn, CHIN

Why Did Software Vendors Participate?

CHIN did not pay vendors to participate or cover their demonstration costs

Vendors saw this as an opportunity to market their software

Vendors that were included in the first editions of the Review had a head start in an emerging market in Canada

CHIN received requests from vendors wanting to participate in subsequent reviews

Page 30: Heather Dunn, CHIN

Influence on the Software Market

“CHIN Accreditation” – achieved if the software imported and exported data in a format compatible with Canada’s national collections inventories

Software products marketed as “CHIN-Accredited”

As a result, many vendors developed an import/export function specifically for the Canadian market, based on CHIN data fields
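
A hypothetical sketch of such an import/export function: the actual CHIN data fields are defined in CHIN's data dictionaries, so the field names and the tab-delimited layout below are assumptions for illustration only.

    import csv

    # Hypothetical subset of CHIN-style fields; not the real data dictionary.
    CHIN_FIELDS = ["AccessionNumber", "ObjectName", "Artist",
                   "DateMade", "Materials"]

    def export_for_artefacts_canada(records, path):
        """Write collection records as tab-delimited rows, one per object,
        keeping only the CHIN-compatible fields."""
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=CHIN_FIELDS,
                                    delimiter="\t", extrasaction="ignore")
            writer.writeheader()
            writer.writerows(records)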

Page 31: Heather Dunn, CHIN

How Museums Used the Review and Related Products

Museums used the Review to shortlist software products

The museum then requested the more detailed “Product Profiles” for their shortlisted systems

The museum requested an in-house demonstration from the vendors of those products that met their criteria

Museums also downloaded and modified the Criteria Checklist, using it to score and rate products during their own demonstrations and so perform their own evaluations of the systems

Museums used the online Software Selection course to guide them through the software selection process

Positive reviews from museums and from evaluators

Page 32: Heather Dunn, CHIN

Collections Management Software in Canada

A very wide variety of software is used in Canada. CHIN does not endorse any particular software. However, the predominant software packages are:

For small museums:
- Virtual Collections (GCI Inc.)
- PastPerfect

For medium to large museums:
- Mimsy (Willoughby/Selago)
- The Museum System (Gallery Systems)
- KE EMu

Page 33: Heather Dunn, CHIN

How Long Did It Take?

About 9 months per Edition

For the 4th Edition:
January 2003 – RFI sent out to vendors
February 2003 – Responses received
April to June 2003 – Software evaluations took place at various locations
July-September 2003 – Results were compiled
Published in late Fall of 2003

Page 34: Heather Dunn, CHIN

Future?

Plans to update the Criteria Checklist in 2010-2011

Update to reflect new functionality of today’s software products

Page 35: Heather Dunn, CHIN

Thank You!

Heather Dunn

Heritage Information Analyst
Canadian Heritage Information Network (CHIN)
Department of Canadian Heritage
Government of Canada

[email protected]