Archive for the 'HL7' Category

Fast Healthcare Interoperability Resources (FHIR)

The HL7 FHIR (pronounced “fire”) standard has been under development for a while. It became a Draft Standard for Trial Use (see DSTU Considerations) in January 2014. The recent announcement of the Argonaut Project vendor collaboration has fueled some “interoperability excitement”™.

The best technical overview I’ve read is this whitepaper: The HL7 Games Catching FHIR. In particular, it does a good job of comparing FHIR with HL7 v3. Summary:

HL7’s FHIR™ standard has learned from the mistakes of HL7 v3, and is surprisingly delightful.
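Part of that delight is that FHIR exposes clinical data as small, self-describing resources over plain HTTP, usually as JSON or XML. A minimal sketch of what a client sees after fetching a Patient resource (a simplified resource here, with a hypothetical server; field layout follows the DSTU Patient resource):

```python
import json

# Roughly what GET https://example.org/fhir/Patient/123 might return
# (hypothetical server; simplified resource for illustration).
patient_json = """
{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": ["Jones"], "given": ["John"]}],
  "gender": "male",
  "birthDate": "1967-08-24"
}
"""

patient = json.loads(patient_json)
full_name = "%s %s" % (patient["name"][0]["given"][0],
                       patient["name"][0]["family"][0])
print(full_name)  # John Jones
```

Compare this with the pipe-delimited HL7 v2 sample later in this archive: an ordinary web developer can read and produce FHIR resources with tools they already have, which is much of the point.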

 

Plug-and-Play Medicine

The MIT Technology Review article Plug-and-Play Medicine claims that:

… a Boston research group has come up with a software platform for sharing information among gadgets …

Uh, what software platform? I discussed the MD PnP program a couple of years ago. Other than the Integrated Clinical Environment (ICE, ASTM F2761:2009) standards work I don’t see a lot of progress, let alone software to look at.  The draft ASTM standard (Dec-2008) is still just a shell. The overall model structure makes sense, but the models themselves are not described in any detail (that I could find anyway).

I have a lot of respect for academic endeavors, but creating a comprehensive standard for something as multidisciplinary and complicated as medical device connectivity will not be an easy task.   I once participated in an ASTM standard that was limited to a single communications protocol, and it took years to finalize.  Developing models for how systems behave and interact is at a level of complexity that I’m not sure can ever really be standardized (how do you nail down a moving target?).

A good example is HL7 V3 RIM (Reference Information Model).  RIM has been under development for more than 10 years and as HL7 RIM: An Incoherent Standard (warning: PDF) points out, many mistakes have been made.  These issues may partially explain the woeful rate of HL7 V3 adoption.

The other thing HL7 V3 shows is the magnitude of work required in these standards efforts. The MD PnP group understands this and that standards are just part of the solution. The slide from here (warning: PDF) summarizes this well:

[MD PnP slide: standards are just part of the solution]

Unfortunately, the devil is in all these messy details and is the reason why plug-and-play medicine is still a long way off.

Hat tip: Mike Attili

Dreaming of Flexible, Simple, Sloppy, Tolerant in Healthcare IT

I was recently browsing in the computer (nerd) section of the bookstore and ran across an old Joel Spolsky edited book: The Best Software Writing I.  Even though it’s been about four years, good writing is good writing, and there is still a lot of insightful material there.

One of the pieces that struck a chord for me was Adam Bosworth’s ISCOC04 Talk (fortunately posted on his Weblog). He was promoting the use of simple user and programmer models (KISS — simple and sloppy for him) over complex ones for Internet development. I think his jeremiad is just as relevant to the current state of EMR and interoperability. Please read the whole thing, but for me the statement that gets to the point is this:

That software which is flexible, simple, sloppy, tolerant, and altogether forgiving of human foibles and weaknesses turns out to be actually the most steel cored, able to survive and grow while that software which is demanding, abstract, rich but systematized, turns out to collapse in on itself in a slow and grim implosion.

Why is it that when I read “demanding, abstract, rich but systematized” the first thing that comes to mind is HL7? I’m not suggesting that some sort of open ad hoc system is the solution to The EMR-Medical Devices Mess. But it’s painfully obvious that what has been built so far closely resembles “great creaking rotten oak trees”.

The challenge for the future of Healthcare interoperability is really no different than that of the Internet as a whole (emphasis mine):

It is in the content and the software’s ability to find and filter content and in the software’s ability to enable people to collaborate and communicate about content (and each other).

I would contend that the same is true for medical device interoperability. Rigid (and often proprietary) systems are what keep devices from being able to communicate with one another. IHE has created a process to try to bridge this gap, but the complexity of becoming a member, creating an IHE profile, and having it certified is also a significant barrier.

Understanding how and why some software systems have grown and succeeded while others have failed may give us some insights. Flexible, Simple, Sloppy, Tolerant may be a dream, but it also might not be a bad place to start looking for future innovations.

Adam also had this vision while he was at Google: Thoughts on health care, continued (see the speech pdf):

… we have heard people say that it is too hard to build consistent standards and to define interoperable ways to move the information. It is not! … When we all make this vision real for health care, suddenly everyone will figure out how to deliver the information about medicines and prescriptions, about labs, about EKGs and CAT scans, and about diagnoses in ways that are standard enough to work.

Also see the Bosworth AMIA May07 Speech (pdf) for how this vision evolved, at least for Google’s PHR.

UPDATE (2/9/09): Here’s a  related article: The Truth About Health IT Standards – There’s No Good Reason to Delay Data Liquidity and Information Sharing that furthers this vision:

We don’t have to wait for new standards to make data accessible—we can do a ton now without standards.  What we need more than anything else is for people to demand that their personal health data are separated from the software applications that are used to collect and store the data.

UPDATE (4/17/09): John Zaleski’s Medical Device Open Source Frameworks post is also related.

Use of an open-source framework approach is probably as good as any. As a management model, I don’t see it as being that much different from the way traditional standards have been developed. Open-source just provides a more ad-hoc method for building consensus. Less bureaucracy is a good thing though. It may also allow for the introduction and sharing of more innovative solutions. In any case, I like visions.

USB plug-n-play (plug-n-pray to some) may be a reasonable connectivity goal, but it does not deal at all with system interoperability. Sure, you can connect a device to one or more monolithic (and stable) operating systems, but what about the plethora of applications software and other devices?  This just emphasizes the need to get out of the “data port” (and even “device driver”) mind-set when envisioning communication requirements and solutions.

CCHIT, The 800-Pound Gorilla

There’s a lively discussion on HIStalk (Readers Write 8/27/08) regarding the merits of CCHIT.  The Jim Tate piece isn’t long, or even that informative. It simply touts CCHIT as an

.. organization that is helping level the playing field and make it safer for clinicians to take the plunge into electronic records.

This seems innocuous enough.  Judging by the negative responses though, some people have real problems with CCHIT. In particular, the Al Borges, MD Response to “CCHIT: the 800-Pound Gorilla” is a detailed point-by-point rebuttal (“CCHIT is simply not necessary.”).

I guess it’s not surprising that the biggest issue seems to revolve around money. The cost of obtaining CCHIT certification has stratified EMR companies (big vs. small) and makes it even more difficult for practices to see an ROI.

Also, the discussion of interoperability standards (or lack thereof) is one of my hot buttons. As Mahoghany Rush (comment #9) says:

Anyone here who talks about being HL7 compliant – and thinks it really solves a problem – has never personally written an interface.

So true.

In this regard Inga asks (in News 8/29/08):

Are lab standards an issue one of the various work groups is addressing? Are the labs on board?

When you say “lab” what you’re really talking about is the large number of medical devices commonly found in both hospitals and private practice offices. As you note, interfaces to these devices are needed so the data they generate can be associated with the proper patient record in the EMR. This not only gives a physician a more complete picture of the patient’s status, but also vastly improves the efficiency of the entire clinical staff, who no longer have to gather all of this information from multiple sources.

The answer to your second question is yes: many “labs” — that is, medical device companies — are actively involved in the development of interoperability standards. The EMR companies are also major participants.

There are two fundamental problems with “standards” though:

  1. A standard is always a compromise.
  2. A standard is always evolving.

By their very design, the use of a standard will require the implementer to jump through at least a few hoops (some of which may be on fire). Also, a device-to-EMR interface you complete today will probably not work for the same device and EMR a year from now — one or both will be implementing the next-generation standard by then.

Nobody dislikes standards. Interoperability is usually good for business. There are two primary reasons why a company might not embrace communications standards:

  1. The compromise may be too costly, either from a performance or resources point of view, so a company will just do it their own way.
  2. You build a proprietary system in order to explicitly lock out other players. This is a tactic used by large companies that provide end-to-end systems.

The “standards” problem is not just a Healthcare interoperability issue. The IT within every industry struggles with this.  The complexity of Healthcare IT and its multi-faceted evolutionary path has just exacerbated the situation.

So, the answer is that everyone is working very hard to resolve these tough interoperability issues. Unfortunately, the nature of the beast is such that it’s going to take a long time for the solutions to become satisfactory.

UPDATE (9/3/08): The response to Inga was published here: Readers Write 9/3/08. Thanks Inga!

Connecting Computers to FDA Regulated Medical Devices

Pete Gordon asked a couple of questions regarding FDA regulations for Internet-based reporting software that interface with medical devices. The questions are essentially:

  1. How much documentation (SRS, SDS, Test Plan) is required and at what stage can you provide the documentation?
  2. How does the FDA view SaaS architectures?

The type of software you’re talking about has no real FDA regulatory oversight. The FDA has recently proposed new rules for connectivity software. I’ve commented on the MDDS rules, but Tim has a complete overview here: FDA Issues New MDDS Rule. As Tim notes, if the FDA puts the MDDS rules into place and becomes more aggressive about regulation, many software vendors that provide medical device interfaces will be required to submit 510(k) premarket notifications.

Dealing with the safety and effectiveness of medical devices in complex networked environments is on the horizon. IEC 80001 (and here) is a proposed process for applying risk management to enterprise networks incorporating medical devices.  My mantra: High quality software and well tested systems will always be the best way to mitigate risk.

Until something changes, the answer to question #1 is that if your software is not a medical device, you don’t need to even deal with the FDA. The answer to question #2 is the same. The FDA doesn’t know anything about SaaS architectures unless it’s submitted as part of a medical device 510(k).

I thought I’d take a more detailed look at the architecture we’re talking about so we can explore some of the issues that need to be addressed when implementing this type of functionality.

[Diagram: medical devices connected to a communications server, which feeds an EMR system and Web clients]

This is a simplified view of the way medical devices typically interface to the outside world. The Communications Server transmits and receives data from one or more medical devices via a proprietary protocol over whatever media the device supports (e.g. TCP/IP, USB, RS-232, etc.).

In addition to having local storage for test data, the server could pass data directly to an EMR system via HL7 or provide reporting services via HTTP to a Web client.

There are many other useful functions that external software systems can provide. By definition though, a MDDS does not do any real-time patient monitoring or alarm generation.

Now let’s look at what needs to be controlled and verified under these circumstances.

  1. Communications interaction with proper medical device operation.
  2. Device communications protocol and security.
  3. Server database storage and retrieval.
  4. Server security and user authentication.
  5. Client/server protocol and security.
  6. Client data transformation and presentation to the user (including printed reports).
  7. Data export to others formats (XML, CSV, etc.).
  8. Client HIPAA requirements.

Not only is the list long, but these systems involve the combination of custom written software (in multiple languages), multiple operating systems, configurable off-the-shelf software applications, and integrated commercial and open source libraries and frameworks. Also, all testing tools (hardware and software) must be fully validated.

One of the more daunting verification tasks is identifying all of the possible paths that data can take as it flows from one system to the next. Once identified, each path must be tested for data accuracy and integrity as it’s reformatted for different purposes, communications reliability, and security. Even a modest one-way store-and-forward system can end up with a hundred or more unique data paths.
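The combinatorial growth is easy to see: path counts multiply across data sources, output transformations, and consumers. With hypothetical counts for a small system:

```python
from itertools import product

# Hypothetical counts for a modest store-and-forward system.
sources      = ["device_a", "device_b", "device_c"]           # 3 devices
transforms   = ["raw", "hl7_v2", "csv_export", "pdf_report"]  # 4 output forms
destinations = ["local_db", "emr", "web_client"]              # 3 consumers

# Every (source, transform, destination) combination is a distinct
# data path that must be verified for accuracy, integrity, and security.
paths = list(product(sources, transforms, destinations))
print(len(paths))  # 36 unique data paths, before error/retry branches
```

Add retry logic, partial-failure branches, and configuration options, and a hundred-plus unique paths is easy to reach.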

A full set of requirements, specifications, and verification and validation test plans and procedures would need to be in place and fully executed for all of this functionality in order to satisfy the FDA Class II GMP requirements. This means that all of the software and systems must be complete and under revision control. There is no “implementation independent” scenario that will meet the GMP requirements.

It’s no wonder that most MDDS vendors (like EMR companies) don’t want to have to deal with this. Even for companies that already have good software quality practices in place, raising the bar up to meet FDA quality compliance standards would still be a significant organizational commitment and investment.

EHR System Modeling

I’m a regular Slashdot reader and it’s rare to come across a health care related post. So Arguing For Open Electronic Health Records of course caught my eye. I’m sure it was the open standards aspect that attracted them, but I also wanted to point out why the use of software modeling is so important to the development of EHRs.

The Tim Cook post is interesting in several respects.

The first is the reiteration of the importance of the “lack of true interoperability standards” and its effect on the adoption of EMRs. I’ve talked about this numerous times.

Another important point is that even though open source licensing may be free, the real costs of implementing any EHR system (i.e. going paperless) are significant.

The importance of understanding and communicating the “semantic context” of patient data is also a key concept.

The goal of the openEHR open-source project is to provide a model and specifications that capture patient data without loss of semantic context. A “two-level modeling” approach is used (from here):

Two-Level Software Engineering

Within this process, IT developers concentrate on generic components such as data management and interoperability, while groups of domain experts work outside the software development process, generating definitions that are used by systems at runtime.

If you’ve done any work with Microsoft’s WPF, this model should look familiar. Separation of responsibilities (designer vs. developer) is one of the fundamental shifts in GUI development that XAML provides. Separating the domain experts from the developers when building a health care IT system is also clearly beneficial.
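The two-level split can be sketched in a few lines: domain experts author a declarative definition (in openEHR this is an ADL archetype; a plain dict stands in for one here, purely for illustration), and developers write one generic routine that interprets any such definition at runtime. New clinical concepts then require new definitions, not new code:

```python
# Clinician-authored definition (a stand-in for an openEHR archetype;
# real archetypes are written in ADL, not Python).
blood_pressure_archetype = {
    "systolic":  {"type": float, "units": "mmHg", "range": (0.0, 300.0)},
    "diastolic": {"type": float, "units": "mmHg", "range": (0.0, 200.0)},
}

def validate(archetype, data):
    """Generic runtime check: developers write this once;
    domain experts supply the archetypes that drive it."""
    errors = []
    for field, rules in archetype.items():
        value = data.get(field)
        if not isinstance(value, rules["type"]):
            errors.append("%s: expected %s" % (field, rules["type"].__name__))
            continue
        lo, hi = rules["range"]
        if not lo <= value <= hi:
            errors.append("%s: %s out of range" % (field, value))
    return errors

print(validate(blood_pressure_archetype,
               {"systolic": 120.0, "diastolic": 80.0}))  # []
```

The software never hard-codes what a “blood pressure” is; that knowledge lives in the definition, along with its semantic context (units, ranges), which is exactly the point of the two-level approach.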

No matter how good the openEHR model is, it unfortunately has the same adoption problems as many other health care interoperability systems: competing “standards”. For example, HL7 V3 Reference Information Model (RIM) and CEN 13606 have the same goals as openEHR.

Developing software systems based on conceptual models of the real world is not new. For example, the OMG Model-driven Architecture (MDA, also see here):

These platform-independent models document the business functionality and behavior of an application separate from the technology-specific code that implements it, insulating the core of the application from technology and its relentless churn cycle while enabling interoperability both within and across platform boundaries.

These types of systems not only provide separation of responsibilities but are also designed to provide cross-platform interoperability and to minimize the cost of technology changes.

The future of interoperable EHR systems will depend on choosing a common information/behavior model so that the benefits of these development technologies can be realized. The challenge is to make adopting a framework that implements that model advantageous to all stakeholders.

HL7 Interfacing: The last mile is the longest.

Tim Gee mentions the Mirth Project as a cost effective solution for RHIOs (regional health information organizations). In particular, he notes that the WebReach appliance is “ready to go” hardware and software.

I’ve recently started looking at HL7 interface engines for providing our ICG electronic records to customer EMR systems. I’ve mainly been evaluating Mirth and NeoIntegrate from Neotool.

One of the Neotool bullet points about HL7 V2 states:

Not “Plug and Play” – it provides 80 percent of the interface and a framework to negotiate the remaining 20 percent on an interface-by-interface basis

Since HL7 V2 is the most widely adopted interface in the US, that last 20% can be a significant challenge. This is one of the primary purposes for HL7 integration tools like Mirth and NeoIntegrate — to make it as easy as possible to complete that last mile.

If you look closely at the Mirth appliance literature you’ll see this in the Support section:

For customers requiring assistance with channel development, WebReach consulting and engineering services are available, and any custom work performed by WebReach can be added to your annual support agreement.

They’re providing a turn-key hardware and integration engine system, but you either have to create the custom interfaces yourself or hire them (or someone else) to do it for you.

<AnalogyAlert>
This means that you have bought the hammer and identified the nail(s) to pound in. All you need to do now is find and hire a carpenter to complete the job.
</AnalogyAlert>
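Concretely, the carpentry is mostly per-site quirks. Mirth transformers are actually written in JavaScript; this Python sketch (with hypothetical site names and formats) shows the flavor of normalization that each interface ends up needing:

```python
from datetime import datetime

# Two hypothetical feeds for the same ADT event: site A sends the
# birth date as YYYYMMDD, site B as MM/DD/YYYY.  An interface engine's
# "channel development" is largely this kind of per-site mapping.
def normalize_dob(raw, site):
    fmt = {"site_a": "%Y%m%d", "site_b": "%m/%d/%Y"}[site]
    return datetime.strptime(raw, fmt).date().isoformat()

print(normalize_dob("19670824", "site_a"))    # 1967-08-24
print(normalize_dob("08/24/1967", "site_b"))  # 1967-08-24
```

Multiply that by every field, every segment, and every sender, and the size of the remaining 20% becomes clear.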

This really shouldn’t be that surprising though. Custom engineering and support is the business model for the WebReach Mirth project and I’m sure a large revenue generator for Neotool.

There is certainly great value in being able to purchase a preconfigured and supported HL7 interface appliance. Just be aware that it’s not quite ready to go.

Update 17-Dec-07:

If anyone has experience using HL7 integration engines that they’d like to share, I’d love to hear from you (preferably through the comments so the answers are shared, but private mail is also fine). In particular, I know there are a number of competing offerings to the ones mentioned in this post, and it would be good to know if they are worth evaluating. Thanks!

Healthcare Un-Interoperability

Or maybe that should be “non-interoperability”? Anyway, I have ranted in the past about the state of the EMR industry. I thought I’d add a little meat to the bone so you could better appreciate the hurdles facing device interoperability in healthcare today.

Here’s a list of the standards and organizations that make up the many components of health information systems. I’m sure that I’ve missed a few, but these are the major ones:

Medical Coding

  • SNOMED (Systematized Nomenclature of Medicine)
  • LOINC (Logical Observation Identifiers Names and Codes)
  • ICD9/10 (The International Classification of Diseases)
  • CPT (Current Procedural Terminology)

Organizations

  • FDA CDRH (Food and Drug Administration Center for Devices and Radiological Health)
  • NHIN (Nationwide Health Information Network)
  • HIMSS (Healthcare Information and Management Systems Society)
  • CCHIT (Certification Commission for Healthcare Information Technology)
  • PHIN (Public Health Information Network)
  • VISTA (Veterans Health Information Systems and Technology Architecture)

Standards

  • HL7 (Health Level Seven: v2 and v3)
  • HIPAA (The Health Insurance Portability and Accountability Act of 1996)
  • 21 CFR Part 11 (FDA/HHS Electronic Records and Signatures)
  • IEEE-1073 (Point of Care Medical Device Communications)
  • IHE (Integrating the Healthcare Enterprise)
  • DICOM (Digital Imaging and Communications in Medicine)
  • HITSP (Healthcare Information Technology Standards Panel)
  • EHRVA (HIMSS Electronic Health Record Vendors’ Association)
  • NCPDP (National Council for Prescription Drug Programs)
  • openEHR (International Foundation that promotes Electronic Health Records)
  • CEN (European Committee for Standardization)
  • CCR (Continuity of Care Record)
  • ANSI X12 (Electronic Data Interchange)
  • MLLP (Minimal Lower Layer Protocol)
  • ebXML (Electronic Business using eXtensible Markup Language)

This list does not include any of the underlying transport or security protocols. They are either data formatting (many based on XML) or specialized messaging systems.

The diagram below gives an overview of how many of these standards are related (from an IEEE-USA purchased e-book — copying granted for non-commercial purposes):

Taxonomy of Core Standards for the NHIN

I don’t know about you, but trying to make sense of all these standards and protocols is not an easy task. A discussion of next generation PHRs summarizes the situation well:

… not only is information scattered, but standards for defining and sharing the data are still evolving; where standards exist, many of them predate the Internet. Standards about how to define consistently the information (clinical standards) and to transmit and exchange the information (technical standards) are not yet formalized and agreed upon.

The point about predating the Internet is an important one. This particularly pertains to HL7 v2.x, which still uses ASCII-delimited messages for transmission over serial lines. For all you 21st-century programmers who may never have seen one before, here’s what an HL7 v2.x message looks like:

MSH|^~\&|AcmeHIS|StJohn|ADT|StJohn|20060307110111||ADT^A04|MSGID20060307110111|P|2.4
EVN|A04
PID|||12001||Jones^John||19670824|M|||123 West St.^^Denver^CO^80020^USA
PV1||O|OP^PAREG^||||2342^Jones^Bob|||OP|||||||||2|||||||||||||||||||||||||20060307110111|
AL1|1||3123^Penicillin||Produces hives~Rash~Loss of appetite
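The structure is purely positional: carriage returns separate segments, `|` separates fields, `^` separates components, and `~` separates repetitions. A minimal Python sketch of walking that hierarchy for a message like the one above:

```python
# A trimmed version of the sample ADT message, one segment per "\r".
msg = (
    "MSH|^~\\&|AcmeHIS|StJohn|ADT|StJohn|20060307110111||ADT^A04"
    "|MSGID20060307110111|P|2.4\r"
    "EVN|A04\r"
    "PID|||12001||Jones^John||19670824|M|||123 West St.^^Denver^CO^80020^USA\r"
    "AL1|1||3123^Penicillin||Produces hives~Rash~Loss of appetite"
)

# Walk the hierarchy: segment -> field (|) -> component (^) / repetition (~).
segments = [seg.split("|") for seg in msg.split("\r")]
pid = next(seg for seg in segments if seg[0] == "PID")
al1 = next(seg for seg in segments if seg[0] == "AL1")

family, given = pid[5].split("^")[:2]  # PID-5: patient name
reactions = al1[5].split("~")          # AL1-5: repeating reaction field

print(family, given)  # Jones John
print(reactions)
```

Everything is identified by position, so sender and receiver must agree out of band on what each field means; that agreement is the per-interface negotiation mentioned throughout this archive.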

HL7 v3 uses XML for its message format, but it has not been widely adopted yet. A good history of HL7 v2 and v3, and an explanation of their differences, can be found here (pdf).

HL7 v2 is commonly used in hospitals to communicate between medical devices and EMR/HIS systems. Even though the communications framework is provided by HL7, new interfaces must still be negotiated, developed, and tested on a case-by-case basis.
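The serial-line heritage shows in the transport, too. Over TCP, v2 messages are usually framed with MLLP (listed among the standards above): a vertical-tab start byte and a file-separator/carriage-return trailer around the raw message bytes. A sketch of the framing (the message content is just a placeholder):

```python
VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"  # MLLP framing bytes

def mllp_wrap(hl7_bytes):
    """Frame an HL7 v2 message for transmission over a TCP socket."""
    return VT + hl7_bytes + FS + CR

def mllp_unwrap(frame):
    """Recover the message, checking the framing bytes."""
    if not (frame.startswith(VT) and frame.endswith(FS + CR)):
        raise ValueError("bad MLLP frame")
    return frame[1:-2]

msg = b"MSH|^~\\&|AcmeHIS|StJohn|ADT|StJohn|20060307110111||ADT^A04|MSGID1|P|2.4"
assert mllp_unwrap(mllp_wrap(msg)) == msg
```

That is the entire transport layer: no content negotiation, no status codes, no security, which is why everything else must be negotiated interface by interface.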

Most of the large EMR companies provide HL7 interfaces, but many of the smaller ones do not. This is because hospitals are not their primary market, so they don’t generally need device interfaces. These EMRs are essentially clinical document management, patient workflow, and billing systems. The only external data they may deal with are scanned paper documents that can be attached to a patient’s record. The likelihood that they would conform to any of the standards listed above is low.

I’m not sure things will improve much with the recent PHR offerings from Microsoft (HealthVault) and Google (Google Health — not yet launched). Microsoft appears to be embracing some of these standards as discussed in Designing HealthVault’s Data Model, but there are a couple of telling comments:

Some of the data types we needed in order to support our partners’ applications were not readily available in the standards community.

Our types also allow each vendor to add “extensions” of their own making to item data – so to the extent that we are missing certain fields, they can be added – and the industry can rally around those extensions if it makes sense.

Microsoft says they are not the “domain experts”, so they’re leaving it to the industry to sort it all out. Great! This is probably the same attitude that got us to where we are today.

Hopefully you can now see why I’ve used the words “mess” and “chaos” to describe the current situation. The challenges facing interoperability in healthcare are immense.
