Today, I had the privilege of attending a one-day seminar put on by IHE USA. IHE, which stands for Integrating the Healthcare Enterprise, is one of a few select groups, like HL7, working toward a standardized data exchange framework. Based on my experience as a solution architect designing technology for healthcare environments, I believe there is a chance that interoperability will destroy a hospital's network. There are a multitude of factors behind this belief...
Believe it or not, healthcare technology lags the broader market substantially. The reason stems from the unique environment in which healthcare technology exists. I figured the best way to illustrate this point is through one of my non-award-winning stories...
A Non-Award-Winning Story About Healthcare Technology...
Go with me on this journey for a second. Imagine you are a producer of widgets for a company trying to break into the healthcare space. You are told that your widget needs to be able to communicate "openly," and you are given a standard to follow. You begin to develop your product, going through the traditional product development steps of selecting your hardware manufacturer, writing your code, and setting up your embedded OS.
Next, you move on to unit testing, and lo and behold, all of your functions check out. Finally, you develop a set of server software. You get the latest version of Windows Server or some *NIX distro, write your server-side code, run some more tests, and develop a UI and storage framework.
You now have a product, and all is well. You begin to work with marketing and say something to the effect of, "Hey, let's sell this product to XYZ hospital." You get a meeting with the executive suite; they like your product, but there is one minor hitch: your product isn't FDA approved. What does the FDA have to do with technology? Well, my friends, any patient-related device has to go through a certification process. OK, that's fine; it knocks you a little off budget. You get the product certified, and then elections happen, committee members change, and that standard you designed against is no longer relevant.
So, you may be asking yourself, "Phil, that's a wonderful story, but what the heck does it have to do with hospital network design and interoperability?" That's a good question. I told that story because it is important to understand that all of the regulation, culture, and cost sensitivity has led us to a point where integrated systems don't really exist in a healthcare environment. Sure, you will get folks who tout integration from system A to system B, but if you want to connect to system C... forget about it!
However, that is changing...
I sat through a full-day conference with multiple government, non-profit, and private organizations that are moving toward interoperability. These organizations had 500+ developers doing a week-long Connectathon to certify healthcare technology for interoperability.
I, for one, believe in the need for change, and with the Affordable Care Act and the increased pressure on providers to use electronic systems for healthcare data, we are about to see a massive influx of data released onto healthcare networks.
For years now, healthcare environments have been designed to be internal bandwidth hogs. Imaging machines push a lot of streaming data, and in the past that data was captured and stored as physical media (i.e., pictures). The electronic capture of data has been going on for years (creating some nightmare QoS tagging scenarios along the way, I might add); what hasn't been going on for years is the need to share this data.
So imagine for a second you have Billy. Billy is very accident-prone, and as a result he is constantly getting X-rays, MRIs, and the like. Through some act of genetic wizardry, Billy clones himself 10,000 times. Now each of these Billys has a bunch of data associated with his name, and when a Billy goes to a different doctor, that data needs to be transferred. Environments are now having to deal with massive upload requirements that are:
- Quality of Service (QoS) sensitive
- Dependent on reliable (TCP) transmission
- High-risk, confidential data
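To make the scale of those upload requirements concrete, here is a minimal back-of-the-envelope sketch. Every figure in it (study sizes, daily volumes, transfer window) is an illustrative assumption, not data from any real facility:

```python
# Rough, illustrative estimate of the sustained upload bandwidth needed to
# move a day's worth of imaging studies between facilities.

# Assumed average compressed study sizes in megabytes (illustrative only).
STUDY_SIZE_MB = {"x-ray": 30, "ct": 500, "mri": 250}

def required_mbps(studies_per_day: dict, transfer_window_hours: float = 8.0) -> float:
    """Sustained megabits/second needed to move a day's studies in the window."""
    total_mb = sum(STUDY_SIZE_MB[kind] * count for kind, count in studies_per_day.items())
    total_megabits = total_mb * 8          # megabytes -> megabits
    window_seconds = transfer_window_hours * 3600
    return total_megabits / window_seconds

# A hypothetical mid-size clinic's daily outbound volume.
daily = {"x-ray": 200, "ct": 40, "mri": 25}
print(f"{required_mbps(daily):.1f} Mbps sustained")
```

Even these modest made-up numbers demand several megabits per second of sustained, QoS-protected, encrypted upload capacity, on top of whatever the facility already pushes internally.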
You have now created three key issues that the healthcare network was not prepared for. How do you deal with these situations? If you are Cleveland Clinic or some other marquee healthcare organization, the issue may not be too bad; you simply plan for it and allocate funds. However, what if you are Contoso Regional Medical Center? You are already losing doctors to better-paying facilities, you have almost no cash to your name, and all of a sudden you need to upgrade to a dedicated fiber pipe, an on-site SAN, and security audits out the wazoo. How are you going to fund this?
Houston, We Have a Problem
One of the speakers at the conference said that Medicare is adding 10,000 people a day. I'm not sure how that works, as that is 3.65 million a year, but who knows; I'm not that big into Medicare. The point is, each of these Medicare recipients brings with them all of the regulatory compliance demands of the government. Quite simply, if your environment can't meet the standards of care, you won't get paid. So now you have hospitals and clinics that have to meet the electronic demands of federal and state regulations if they want to get paid.
Because of this, capital costs start rolling in. You need to upgrade software? There's a consultant for that. You need to survey and upgrade your network and servers? There's a consultant for that. You need to adhere to all of the new security standards? There's a consultant for that. Finally, you need to educate your staff and change your workflows and culture? There's a consultant for that, too.
You are just starting this journey, and your budget is already eaten up by consulting fees to set up the framework. Now you move into the network, security, server, and audit work, and someone like myself comes in. I help you go through a step-by-step process (slimmed down for the sake of brevity):
- Interview the key stakeholders to understand regulations and workflow in order to create requirements.
- Align technologies to the requirements.
- Survey the existing traffic and project future traffic. I forecast QoS, data flow, routing, etc., along with storage size, compute requirements, and integrations. From here I create an enterprise design, then work with the client to shop that design around.
- Finally, we execute the contracts and test the environment.
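The "project future traffic" step above is essentially compound-growth arithmetic. Here is a minimal sketch of the kind of capacity forecast involved; the starting footprint, growth rate, and headroom factor are placeholder assumptions, not figures from any real survey:

```python
# Illustrative capacity forecast: project a storage footprint over a refresh
# cycle, assuming compounding annual growth plus a safety headroom.
# All inputs below are hypothetical.

def project_storage_tb(current_tb: float, annual_growth: float, years: int,
                       headroom: float = 0.2) -> float:
    """Projected terabytes after compounding growth, padded by a headroom factor."""
    projected = current_tb * (1 + annual_growth) ** years
    return projected * (1 + headroom)

# e.g. 50 TB today, 40% yearly growth, sized for a 5-year refresh cycle
print(f"Plan for roughly {project_storage_tb(50, 0.40, 5):.0f} TB")
```

The point of running numbers like these early is that a seemingly healthy current footprint can balloon several-fold within a single refresh cycle, which is exactly the budgeting conversation most small facilities are not having.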
All of this assumes the organization can afford these technology refreshes. In reality, several of the smaller organizations will collapse and be purchased by the larger regional consortiums.
This post tied several business concepts to the reality of technology within the healthcare space. I wanted to show how the adoption of interoperability and data exchange (which has to happen, by the way) will significantly impact IT organizations.
Data flow, security, storage, and compute are just a few of the issues we need to address, yet very few people are planning for them. Many of the IT organizations I deal with are on limited budgets. Why do 10Gb to the edge when 1Gb will work just fine? Why have 10% excess capacity built into storage and compute? Why wouldn't I go with that core switch that will EoL in a year? Those are the issues I deal with on a daily basis, and there is no good answer.
The truth is, we just need to keep banging our drums. Make the best economic business case you can for a future-proofed design. The big thing in Commercial Real Estate (CRE) right now is the concept of life-cycle cost. We are putting high-capacity switching gear in CRE environments not because they need it now but because of potential future demand. There is nothing like having to rip a ton of wire and switches out of a building that is two years old.
So what are your thoughts on what I just wrote? If you deal with the healthcare environment how do you see interoperability impacting the business in the near future?
Let me know in the comments below!