Clinical LOINC® Tutorial Panels, Forms, and Patient Assessments
Clinical LOINC Meeting
Daniel J. Vreeman, PT, DPT, MSc Assistant Research Professor, Indiana University School of Medicine Associate Director of Terminology Services, Regenstrief Institute, Inc
07.15.2010 Copyright © 2010
Overview
• Background
• Standard Panels in LOINC
• Enhanced Panel Model for Patient Assessment Instruments
• Current Projects
• Lessons Learned
Standard Panels in LOINC
Enumerated child elements
Panels (Batteries) in LOINC
• Panel term linked to enumerated child elements
– Child elements can be panels themselves (nesting)
• Panel term names (under discussion)
– Component often includes “panel” and the authoritative source
– Property typically “-” because child elements will vary
– Scale typically “-” because child elements will vary
– Class PANEL.*
• Child elements linked and identified as:
– Required (R): element always reported with the panel
– Optional (O): element may not be reported, depending on institutional policies or capabilities
– Conditional (C): element is a key finding and thus should be assumed to be negative, absent, or not present if the panel result does not include data for it
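The nesting and R/O/C flags described above can be sketched as a small tree structure. This is an illustrative model only, not LOINC's actual schema; the class names, field names, and all codes below are made up.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PanelChild:
    loinc_code: str
    requirement: str          # "R" (required), "O" (optional), "C" (conditional)
    children: List["PanelChild"] = field(default_factory=list)  # a child may itself be a panel

# A toy panel whose second child is itself a nested panel (codes are fabricated)
panel = PanelChild("11111-1", "R", [
    PanelChild("22222-2", "R"),
    PanelChild("33333-3", "O", [PanelChild("44444-4", "C")]),
])

def required_codes(node: PanelChild) -> List[str]:
    """Collect codes of all required (R) elements, recursing into nested panels."""
    out = [node.loinc_code] if node.requirement == "R" else []
    for child in node.children:
        out.extend(required_codes(child))
    return out

print(required_codes(panel))   # → ['11111-1', '22222-2']
```

A consumer applying the conditional rule would treat any "C" element missing from a result message as negative/absent rather than as unknown.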
Example Panel
Example Panel with Nesting
Clinical Panels
Patient Assessments in LOINC
Iterative enhancements of the panel model
Upcoming AMIA 2010 Paper
Introduction
• Patient assessments are widely used to measure a broad range of health attributes
– Functional status, depression, health-related quality of life, etc.
– Survey instruments, questionnaires, assessment forms, etc.
• Observations from patient assessments (whether clinician-observed or self-reported) are in many respects very similar to other kinds of clinical observations
• Survey instruments have psychometric properties
• Question meaning is tightly coupled with the answers
General Aim: LOINC could serve as a “master question file” and provide a uniform representation
Approach
• Iterative refinement of the base panel model as we added new content
– Kept uncovering new wrinkles
• Collaborated with many people
– Tom White, the CHI Functioning and Disability workgroup, ASPE, AHIMA, CMS, RTI, HL7, HITSP, and others
• Represent the full assessment content with attributes at three levels
– Individual item, answer list, panel-specific item instance
Attributes of Assessment Items
• Question (item) name/text
– Exact question text, form-specific display name
• Data type
• Definition/description
• For numeric values: units of measure, range checks
• For categorical results: answers in an answer list
• Copyright and terms-of-use notices
• HL7 field sub-ID
• HL7 data types (v2 and v3)
Structured Answer Lists
• Many items have highly specialized, fixed answer lists
– Often the answer lists define the meaning of the question
– Few are represented by existing codes in reference terminologies
• LOINC has created answer codes where needed
– They have an “LA” prefix and a mod-10 check digit
– They are unique by lexical string (ignoring capitalization)
– They intentionally do NOT distinguish based on context-specific meaning
• In some cases, the answer list is identified with a Regenstrief-assigned OID (for HL7 CDA use)
– Lists are identified as “normative” vs. “example”
• Answer lists show a sequence, but systems are not bound by it
• We store local codes for items and have a place to store a universal code (e.g., SNOMED CT) if we’re able
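The mod-10 check digit mentioned above can be computed with a Luhn-style algorithm; the sketch below is an assumption about the exact variant, but it reproduces the check digits of published LOINC codes such as 2345-7. The "LA12345" payload is fabricated for illustration, not a real answer code.

```python
def mod10_check_digit(payload: str) -> int:
    """Luhn-style mod-10 check digit: double every other digit starting
    from the rightmost, sum the digits of the results, and return the
    distance to the next multiple of ten."""
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:                    # rightmost digit and every second one left of it
            d = d * 2 if d < 5 else d * 2 - 9
        total += d
    return (10 - total % 10) % 10

print(mod10_check_digit("2345"))                       # → 7, matching LOINC code 2345-7
print(f"LA12345-{mod10_check_digit('12345')}")         # illustrative answer-code shape
```

The same function can validate incoming codes by recomputing the digit after the hyphen and comparing.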
Attributes of Items in a Panel Instance
• Some non-defining attributes of an item vary by panel
– Vary across instruments or different forms of the same assessment
• Represented at the level of the item instance in a panel
– Display name override (e.g., “BMI” vs. “Body Mass Index”)
– Cardinality
– Observation ID in form (local code)
– Skip logic
– Data type in form
– Answer sequence override
– Consistency/validation checks
– Relevance equation
– Coding instructions
Advantages of the Master Catalog
• A single database (LOINC) contains the details about individual observations and sets
– In the database, all forms (sets) look the same
– Automatic standardization
• Separates the form structure, question details, the rendered version (paper or screen), and the program that manages it
• Can easily reuse observations (and attributes) in different forms/sets
Panels/Forms Available as Separate Download
http://loinc.org/downloads
Rules for Display of Items
• SURVEY_QUEST_TEXT (if populated). Used when the item is asked as a question. Sometimes the item has both a label and a question, so we store both as [label].[question text]
• DISPLAY_NAME_FOR_FORM (if populated). Provides an override display linked to the instance of the LOINC term in a particular form. Allows for presentation variation that doesn’t affect meaning, and for cases where the LOINC naming conventions require some difference between the item and the LOINC Component
• COMPONENT. This is the default display
Pain Presence. Ask resident: “Have you had pain or hurting at any time in the last 7 days?”
Item label = “Body Mass Index (BMI)” LOINC Component = “Body mass index”
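The display rules above amount to a simple fallback chain. A minimal sketch: only the three field names (SURVEY_QUEST_TEXT, DISPLAY_NAME_FOR_FORM, COMPONENT) come from the slides; the function and record shape are illustrative.

```python
def display_text(item: dict) -> str:
    """Pick display text in precedence order: question text first,
    then the form-specific override, then the default Component name."""
    return (item.get("SURVEY_QUEST_TEXT")
            or item.get("DISPLAY_NAME_FOR_FORM")
            or item["COMPONENT"])

# The BMI example from the slide: the form shows the override, not the Component
bmi = {"COMPONENT": "Body mass index",
       "DISPLAY_NAME_FOR_FORM": "Body Mass Index (BMI)"}
print(display_text(bmi))   # → Body Mass Index (BMI)
```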
Successes and Current Projects
Currently in LOINC
• US Government Forms
– CARE, MDSv2, MDSv3, OASIS B1, OASIS C RFC
– US Surgeon General’s Family Health Portrait
• Brief Interview for Mental Status (BIMS)
• Confusion Assessment Method (CAM)
• Geriatric Depression Scale (GDS)
• HIV Signs and Symptoms Checklist
• Home Health Care Classification
• howRU
• Living with HIV (LIV-HIV)
• Morse Fall Scale
• OMAHA
• PHQ (9 and 2)
• Quality Audit Marker (QAM)
Find them in RELMA
ASPE as Key Supporter
• ASPE (Jennie Harvell) has championed the use of HIT standards for assessment instruments in many venues
• Initial reports
– Making the “Minimum Data Set” Compliant with Health Information Technology Standards
– Standardizing the MDS with LOINC® and Vocabulary Matches
Consolidated Health Informatics
• CHI Goal:
– Adopting interoperability standards for all US federal health agencies
• Adopted LOINC as a standard for:
– Laboratory result names (2003)
– Laboratory test order names (2006)
– Meds: structured product labeling sections (2006)
– Federally-required patient assessment instruments with functioning and disability content (2007)
Many Other Opportunities
• PhenX Measures
• PROMIS
• Neuropsychological testing instruments (APA)
• Lots of other commonly-used instruments (SF-36, etc.)
• CDC case report and other forms
• National physical therapy outcomes database measures
Challenges and Lessons Learned Corralling the Creativity
Lesson 1 Variation Abounds
Variation Abounds
• Despite the many instruments now in LOINC, reuse of items has been minimal
– E.g., extremely few of the same items appear in both MDSv2 and MDSv3
– MDSv3 has greater similarity to CARE, but the lookback period is different (7 days vs. 2 days)
• We noticed differences that might have been avoided
• We urge developers to weigh the cost of losing comparability before inventing new items
[Slide screenshots compare item wording across the original PHQ-9, CARE, MDSv2, MDSv3, and OASIS]
Lesson 2 Starting from a uniform data model may bring clarity
A Uniform Data Model Would Help
• We usually started from paper forms, though some instruments had their own software and data structures
• We were forced to reconcile many potential discrepancies
– “Unknown”/“unable to determine” as answer choices vs. flavors of null
– How do you store “Other specified ______”?
– Are units of measure implied?
– Which text is the item and which is “help” text?
• Incongruence with the HL7+LOINC model
– Items for things that could go in PID segments, etc.
– Flat data model vs. stacked
– Every “Check all that apply” stored as a separate yes/no item
• Interoperable data exchange standards haven’t been on the minds of survey developers
• Starting with the LOINC model may elucidate hidden challenges
Many Yes/No Diseases
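The "Check all that apply" point above, where a single multi-select question becomes one yes/no observation per listed option, can be sketched as a flattening step. All codes below are fabricated for illustration; none are real LOINC codes.

```python
def flatten_check_all(options: dict, checked: set) -> list:
    """Turn one multi-select question into one Y/N observation per option.

    options maps an (illustrative) item code to its label; checked holds
    the codes the respondent ticked. Unticked options become explicit "N"
    results rather than being omitted.
    """
    return [(code, "Y" if code in checked else "N") for code in options]

# Hypothetical disease checklist
diseases = {"dx-1": "Asthma", "dx-2": "Diabetes mellitus", "dx-3": "Hypertension"}
print(flatten_check_all(diseases, {"dx-2"}))
# → [('dx-1', 'N'), ('dx-2', 'Y'), ('dx-3', 'N')]
```

Storing explicit "N" values is what lets a receiving system distinguish "not present" from "not asked."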
Lesson 3 IP issues present large challenges
Intellectual Property Issues
• We must negotiate separate agreements with each copyright/IP holder for inclusion in LOINC
• Many instruments have difficult restrictions
– Protection against change, and attribution requirements, are understandable
– Some want royalties
– Commercial use in LOINC’s context is tricky
• Even more complicated when several instruments are included in larger CMS ones (MDS, CARE, etc.)
• Funders should require developers to avoid such restrictive licenses
Lesson 4 Always new challenges
Always New Challenges
• Answer list sequences
– The same answers appear across instruments, but in a different order
• Skip logic shown at the level of the answer
– Our current strategy is to aggregate it up to the question level
• Items with pictures
• Computer-adaptive testing coefficients and attributes