There is a growing recognition that we need an open platform concept to solve e-health interoperability and reuse problems. I noted some evidence of this in my recent post ‘What is an open platform’, including various US-based cross-vendor platform alliances. The great value of a well-designed open platform is that it enables two things:
- a growing platform-based economy of producers to collaborate technically while operating commercially and/or in an open source mode
- adaptation to the constant stream of new requirements.
This is in contrast to the typical de jure standard based on a particular use case: it solves a locked-down definition of the problem in a locked-down way.
Last week saw the first major face-to-face international openEHR community meeting, which took place in Lillestrøm, near Oslo, at premises kindly organised by DIPS ASA, openEHR Industry Partner and major EHR supplier in Norway.
The remaining hard core at the end of day 2 (photo: Dr Shinji Kobayashi)
Following on from my post yesterday, Grahame Grieve commented that I had not dealt with issues of stability and commercial acceptability. I had not originally intended to do that, but on reflection, he is right – a standard that is going to survive and be worthy of wide-scale investment can’t be separated from its governance and commercial / legal situation. To address that, I updated the main article, here, and I have also revised the short list below. The shortest form of statement is provided by the blue headings on the left.
Necessary Characteristics for e-Health Standard Longevity and Investibility (v2)
- Platform framework: Does the technology define overall elements of a platform into which recognisable specifics could be plugged, e.g. information models, content definitions, workflow definitions, APIs, code generation tools, etc? OR
- Platform component: Does the technology define something that can be properly integrated into an existing platform definition?
- Semantic Scalability
- Domain Diversity: Does the technology provide a practical method of dealing with potentially massive clinical content diversity?
- Local Variability: Does the technology provide a practical method of dealing with potentially massive localised variability?
- Change over Time: Does the technology provide a practical method of dealing with ongoing change in information requirements due to new science, -omics and drugs; new clinical protocols and methods; legislative changes; and changing patient / consumer needs?
- Does the technology provide an automatable way for clinically complex models to be consumed by ‘normal developers’ to build ‘normal software’, including for the purpose of integrating with existing systems, data sources and sinks?
- Data accessibility: Is the standard designed such that all data elements are easily computationally accessible at the finest granularity?
- Query methodology: Does the technology provide a way to query fine-grained information based on models of content, not physical representation (physical DB schemas, specific XML doc schemas etc)?
- Responsive Governance
- Domain-led requirements: Are requirements statements and their prioritisation led primarily by domain experts?
- Industry-led roadmap: Can the roadmap of future releases, i.e. allocation of changes and timing of each release be influenced by industry implementers?
- Release stability: Are releases over time coherent with respect to each other in ways that enable economic upgrading of implementations (industry side) and smooth deployments of new versions (user / provider side)?
- Responsive feedback mechanism: Is there a visible and easy-to-use mechanism for reporting issues and problems with all levels of artefact, i.e. requirements, current release, reference implementation(s)?
- Accountability: Is the governing organisation transparently accountable to key stakeholders for the outputs of the organisation?
- Commercial Acceptability
- Free core IP: Are the standard and its core computable artefacts free to use?
- IP openness futureproof: Are there mechanisms to prevent the IP of the standard and related artefacts from being unilaterally privatised or otherwise made commercially unacceptable over time, including to small companies and user organisations?
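The ‘query methodology’ criterion above — querying fine-grained data via paths from the content model rather than via a physical DB or XML schema — can be sketched in a few lines. This is a minimal illustration only, not any real openEHR or vendor API; the record layouts, path syntax and function names are assumptions made for the example.

```python
# Minimal sketch of model-based querying: data items are addressed by
# paths defined in the content model (in the spirit of openEHR archetype
# paths), not by any physical storage schema. All names are illustrative.

def query(records, path, predicate=lambda v: True):
    """Return the values found at a content-model path across records,
    filtered by an optional predicate."""
    results = []
    for record in records:
        node = record
        for segment in path.strip("/").split("/"):
            if isinstance(node, dict) and segment in node:
                node = node[segment]
            else:
                node = None
                break
        if node is not None and predicate(node):
            results.append(node)
    return results

# Two records whose physical layouts differ elsewhere (different extra
# fields, different sources), but which share the modelled path for
# systolic blood pressure — so one query serves both.
records = [
    {"blood_pressure": {"systolic": 142}, "source": "vendor_a_feed"},
    {"blood_pressure": {"systolic": 118}, "legacy_id": "XY-7"},
]

high = query(records, "/blood_pressure/systolic", lambda v: v >= 140)
print(high)  # [142]
```

The point of the sketch is that the query is written once against the modelled path and is untouched by changes to physical representation — which is exactly what schema-bound SQL or XPath queries cannot offer.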
A new e-health standard comes along every couple of years. In Gartner hype cycle terms, it starts out on the rise toward the ‘peak of inflated expectations’, then falls into the ‘trough of disillusionment’, before either dying or rising again over the ‘slope of enlightenment’ to a ‘plateau of productivity’. Most standards and e-health technologies (standards + their tools and artefacts) die before getting to this plateau. But why? What’s wrong with them? How can we pick a winner?
(Gartner hype cycle, from Wikipedia)
The latest hyped e-health technology is of course HL7 FHIR – Fast Healthcare Interoperability Resources.
Recently I was asked to provide testimony to the ONC hearings on the JASON report, from an openEHR point of view. I did so on 31 July 2014. The JASON report is entitled “A Robust Health Data Infrastructure”. It surveys the problems of health data interoperability, and proposes the adoption of a unifying ‘software architecture’ as the solution. It also seems to imply a federated health record database. Its primary assumption appears to be that APIs are the key element of the solution, and that their standardisation will fix the problem.
A strange thing happens in health IT solution procurement, and by extension government initiatives that seek to influence it. See if you can disagree with the following characterisation of health provider organisations as solution purchasers.
Think You’re Getting What You Want?
CIOs and CMOs have known for years if not decades that:
- the data used inside their institution are their most important asset – either as a productive resource, or at least as an object of risk management. Most today would understand it as both.
- the data used inside their institution are not all produced inside their institution – lab data often come from external lab companies; they obtain, or would like to obtain, GP data such as medication lists, problem lists and so on;
- their main vendor solution never supports the data richness actually required by clinicians – it is well known, for example, that most hospitals contain dozens if not hundreds of hidden specialists’ databases, often referred to as the ‘Access Database problem’;
- if they want to switch to a new vendor, the changeover costs related to the data alone are massive and the risks so great that this consideration alone paralyses them for years with the current ineffective solution;
- no one vendor can produce all the functionality they require – even the biggest vendor has a ‘roadmap’ containing numerous features the provider wants today; no large procurement avoids significant and horribly expensive ‘customisation’;
- they cannot possibly afford to buy all the functionality they require in one go – if they wrote down the full wish list and then published a tender for it with an open budget, the resulting price tag would undoubtedly be beyond their means;
- procurement of multiple ‘best-of-breed’ solutions for e.g. inpatient, ER, ambulatory etc comes with huge ongoing costs for data and workflow integration;
- they cannot possibly afford logistically to deploy all the functionality they want in one go – the human costs and challenges of change management not to mention solution integration make this impossible;
- asking for even the smallest change to the data schema or core functionality costs inordinate amounts of money and usually a long wait as well;
- it would be nice if their IT department could have access to ‘their’ data, but of course they can’t, not without the vendor’s say so and price tag.