Archive for the 'Microsoft' Category

Controlling a Computer with EMG

Microsoft has applied for a patent for using finger flexing to control a computer.

I’ve talked many times in the past about the use of EEG technology for computer control (brain-computer interface, BCI).

As discussed in the RWW article, there are many challenges to making this work. Just like with EEG, calibration of the EMG sensors and training will require innovative solutions.

It seems to me that this type of gesture-based control has quite a bit more potential than interpreting EEG signals.  In either case, the big benefit of advances in these human-computer interface (HCI) technologies is that they could ultimately improve communication capabilities for the disabled.

Microsoft BUILD Conference

Wow — talk about drinking from a fire hose. BUILD Conference news and opinions are everywhere.

Fun stuff. It’s going to take a while to digest all of this.

Binary Waveform Data in SQL Server 2008

As Shahid points out in Consider MySQL ‘Archive’ storage engine to store large amounts of med device structured or waveform data, saving physiologic waveform data from a medical device in a MySQL database for archive purposes is a reasonable alternative to using flat files.

In SQL Server 2008 you can have it both ways.  In addition to saving binary data directly in the database, you have the option of having a varbinary column stored as a file stream. From the article How to store and fetch binary data into a file stream column:

File stream data can be used from the .NET Framework using the traditional SqlParameter, but there is also a specialized class called SqlFileStream which can be used with .NET Framework 3.5 SP1 or later. This class provides mechanisms, for example, for seeking a specific position from the data.

There are pros and cons to this approach. The backup and transactional issues, along with the performance considerations, all have to be evaluated against your specific system requirements.  Having the SQL Server engine manage the database relationship to the binary files seems like a big advantage over maintaining flat files yourself.
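To make the waveform-storage idea concrete, here is a rough Python sketch of packing physiologic samples into a binary blob suitable for a varbinary(max) column. The Waveform table, its columns, and the pyodbc usage are hypothetical illustrations, not taken from the article:

```python
import struct

def pack_waveform(samples):
    """Pack a list of float samples into a little-endian binary blob,
    prefixed with a 4-byte sample count."""
    return struct.pack('<I', len(samples)) + \
           struct.pack('<%df' % len(samples), *samples)

def unpack_waveform(blob):
    """Recover the sample list from a blob created by pack_waveform."""
    (count,) = struct.unpack_from('<I', blob, 0)
    return list(struct.unpack_from('<%df' % count, blob, 4))

# Hypothetical parameterized insert (e.g. via pyodbc); the Waveform table
# and its varbinary(max) Data column are assumptions, not a real schema:
INSERT_SQL = "INSERT INTO Waveform (DeviceId, Data) VALUES (?, ?)"
# cursor.execute(INSERT_SQL, device_id, pack_waveform(samples))
```

Whether the blob ends up as in-row varbinary data or a FILESTREAM file is then a storage decision the application code doesn't have to care about.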

Read the MSDN article FILESTREAM Storage in SQL Server 2008 for all the gory details.

UPDATE (3/25/11): Who’s Got Access to your FileStream Directories?

A Threat Analysis of Networked Medical Devices

Here’s an interesting analysis of security threats within a Windows-based hospital network for embedded medical devices: A threat analysis of critical patient monitoring medical devices.

The threat models are fairly complex and clearly a product of wider enterprise network IT security needs. I’ve discussed some of the other issues of putting medical devices on an institutional network in Networked Medical Devices. Security threats were not covered there, but they are an important topic for every hospital network.

There are a couple of items in this article worth commenting on.

The top five unmitigated threats were found to be:

The corrective action for the top threat (T002) was (my highlight):

After it was decided to remove all ePHI from the medical device data storage, the risk assessment changed and the threat of the medical device infecting the hospital enterprise network (T017) then became our primary concern.

This may be the “most effective countermeasure possible for HIPAA compliance and protecting patient privacy”, but it is not a practical solution in the real world. Many medical devices store patient demographics, and because the benefits of patient identification outweigh the security risks, this practice is unlikely to change.

On these questions:

  1. Can the medical devices be infected from the enterprise network?
  2. Can the medical devices be infected via removable media?
  3. Can infected medical devices propagate malicious software back into the enterprise network?

I generally agree with the conclusions for the device under analysis. The challenge for a hospital is ensuring that every networked medical device follows these best practices (communications integrity, hardened OS, clean distribution media, etc.).

A Medical Device Gateway Data Standard?

The Wipro OEM medical device gateway press release makes it all seem so easy (my highlight):

The device, consisting of interfaces that can feed-in data such as blood pressure, pulse rate, ECG reading and weight from the respective devices, is connected to the gateway that would format it into standard patient information and transmit it to either public health data platform such as Google Health or to private platforms like Microsoft Health Vault.

What exactly is “standard patient information”?  Maybe they’ve finally developed the magic interoperability bullet.  Yeah, right!  I’m sure companies like Capsule see these kinds of claims all the time.  Statements like these are unfortunate because they give the impression that health data interoperability is a given. Of course we know that is not the case.

Also, since when is Google Health a public health data platform?

Hat tip: Avantrasara

UPDATE (11/19/09):  Wipro ties up with Intel for rural medical solutions

Access to Medical Data: Are PC Standards and PHRs (You) the Answer?

Dana Blankenhorn’s article Give medicine access to PC standards makes some good points about the medical device industry but (IMHO) misses the mark when trying to use PC standards and PHRs as models for working towards a solution.

I’ll get back to his central points in a minute. One thing I find fascinating is the knee-jerk reaction in the comments to even a hint of government control.  How on earth can someone jump from “industry standard” to a “march towards socialism”? We saw the same thing at this summer’s town hall meetings and in Washington a couple of weeks ago.  The whole health care debate is just mind-boggling!

Anyway, let’s focus on the major points of the article. First:

Every industry, as its use of computing matures, eventually moves toward industry standards. It happened in law, it happened in manufacturing, it happened in publishing.

It has not happened, yet, in medicine.

Very true.  In the medical device world, connectivity and interoperability are hot topics. A couple of recent posts — Plug-and-Play Medicine and Medical Device Software on Shared Computers — point out the significant challenges in this area.  In particular, the development and adoption of standards is a very intensive and political process. But where’s the incentive for the industry to go through this? Dana’s comment addresses this (my emphasis):

The role I like best for government is in directing market incentives toward solutions, and not just to monopolies or bigger problems.

The reason health care costs jump every year is because market incentives cause them to. Those incentives must be changed, but the market won’t by itself because the market profits from them.

Only government can transform incentives.

Like it or not, this may be the only way to push the medical industry to do the right thing.  But those other industries didn’t need government intervention in order to create their standards.  Using PC (or other industry) standards as a model for facilitating medical data access just doesn’t work.  The health industry will have to be dragged to the table kicking and screaming, and the carrot (or stick) will have to be large in order for them to come to a consensus.

Second, I don’t see the relationship between the use of PHRs and the promotion of standards.

By supporting PHRs, you support your right to your own data. You support liberating data from proprietary systems and placing it under industry standards.  You support integrating your health with the world of the Web, and the benefits such industry standards can deliver to you.

Taking responsibility for your own health data is great, but both Microsoft HealthVault and Google Health are proprietary systems.  Just because your data is on the Web doesn’t make it any more accessible.  And even if one of these PHRs did become an industry standard, it would have very little impact on how EMRs communicate with each other or with medical devices in general.

There are no easy answers.

Liberate the Data!

Peter Neupert’s post Tear Down the Walls and Liberate the Data is worth reading. There are some Microsoft-centric comments, but a number of the linked articles are good and the overall message is correct (IMO anyway).

I might have tried to find a better analogy than ‘tear down this wall’, but that’s because I was never a Ronald Reagan fan.  Nevertheless, this gets across the primary point:

What’s of paramount importance is liberating the data and making it available for re-use in different contexts.

Two major ‘walls’ stand in the way of this:

  1. “it’s-my-data”
  2. “waiting-for-the-right-standards-set-by-government”

Both exist because of the perceived competitive advantages they provide to organizations and vendors.

Interoperability of data, or enabling data to become “liquid,” would allow it to flow easily from system to system. These challenges are the same ones addressed by Adam Bosworth that I discussed in Dreaming of Flexible, Simple, Sloppy, Tolerant in Healthcare IT.

The technical issues are complicated, but I also believe they are not the primary reason that health IT systems fail to interoperate.  As Peter suggests, it would be good for HiTech dollars to be used to break down some of the more difficult barriers that prevent data liquidity.

The “proven model[s] for extracting and transforming data” do exist and there is no excuse not to use them.

After thinking about it some more, a more cautionary analogy may be The Exodus — Moses leading the Israelites out of the Land of Egypt (“let my data go!”).  1) It took an act of God to part the Red Sea, and 2) after their dramatic escape they roamed the desert for 40 years. Let’s hope that health IT interoperability does not need divine intervention or suffer the same fate.

Exploring Cloud Computing Development

It’s not easy getting your arms around this one. The term Cloud Computing has become a catch-all for a number of related technologies that have been used in enterprise-class systems for many years (e.g. grid computing, SOA, virtualization).

One of the primary concerns about cloud computing in Healthcare IT is privacy and security.  The majority of the content and comments in just about every article or blog post about cloud computing, whether about health data or not, deals with these concerns. I’m going to save that discussion for a future post.

I’m also not going to dig into the multitude of business and technical trade-offs of these “cloud” options versus more traditional SaaS and other hybrid server approaches.  People write books about this stuff, and there’s a flood of Internet content that slices and dices these subjects to death.

My purpose here is to provide an overview of cloud computing from a developer’s point of view so we can begin to understand what it would take to implement custom software in the cloud.  All of the major technical aspects are well covered elsewhere and I’m not going to repeat them here. I’m just going to note the things I think are important to consider when looking at each option.

Here’s a simplified definition of Cloud Computing that’s easy to understand and will get us started:

Cloud computing is using the internet to access someone else’s software running on someone else’s hardware in someone else’s data center while paying only for what you use.

As a consumer of, let’s say, a social networking site or a PHR, this definition fits pretty well.  There’s even an EMR implemented in the cloud, Practice Fusion, that would fit this definition.
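The pay-only-for-what-you-use part of that definition is easy to quantify with some back-of-the-envelope arithmetic. The rates below are invented purely for illustration and are not actual prices from any provider:

```python
def on_demand_cost(hours_used, rate_per_hour):
    """Pay only for the instance-hours actually consumed."""
    return hours_used * rate_per_hour

# Hypothetical numbers: a nightly batch job uses 20 instance-hours a month
# at an illustrative $0.10/hour, versus an always-on server at $100/month.
cloud_cost = on_demand_cost(20, 0.10)
dedicated_cost = 100.0
```

With usage that spiky, the on-demand model wins by a wide margin; with a sustained 24/7 load the comparison can easily flip the other way.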

As a developer, though, I want it to be my software running in the cloud so I can make use of someone else’s infrastructure in a cost-effective manner.  There are currently three major cloud computing options.  Cloud Options – Amazon, Google, & Microsoft gives a good overview of these.

The Amazon and Google diagrams below were derived from here.

Amazon Web Services

Amazon Cloud Services

The Amazon development model involves building Xen virtual machine images that are run in the cloud by EC2. That means you build your own Linux/Unix or Windows operating system image and upload it to be run in EC2. AWS has many pre-configured images that you can start with and customize to your needs. There are web service APIs (via WSDL) for the additional support services like S3, SimpleDB, and SQS.  Because you are building self-contained OS images, you are responsible for your own development and deployment tools.

AWS is the most mature of the cloud computing options.  Applications that require the processing of huge amounts of data can make effective use of on-demand EC2 instances managed by Hadoop.
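The Hadoop model is essentially map and reduce steps applied to partitioned data spread across many instances. A toy, in-process word count (a sketch of the idea only, not the actual Hadoop API, which is Java) looks like this:

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (word, 1) pairs -- the work that would be farmed out
    across EC2 worker instances."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    """Sum the counts for each word -- the aggregation step."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(documents):
    return reduce_phase(map_phase(documents))
```

Because each map task is independent, the same code scales out naturally: Hadoop just runs the map phase on many machines and shuffles the pairs to the reducers.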

If you have previous virtual machine experience (e.g. with Microsoft Virtual PC 2007 or VirtualBox), one of the main differences when working with EC2 images is that they do not provide persistent storage. EC2 instances have anywhere from 160 GB to 1.7 TB of attached storage, but it disappears as soon as the instance is shut down. If you want to save data you have to use S3, SimpleDB, or your own remote storage server.

It seems to me that having to manage OS images along with applications development could be burdensome.  On the other hand, having complete control over your operating environment gives you maximum flexibility.

A good example of using AWS is here: How We Built a Web Hosting Infrastructure on EC2.

Google AppEngine

Google App Engine

GAE allows you to run Python/Django web applications in the cloud.  Google provides a set of development tools for this purpose, i.e. you can develop your application within the GAE run-time environment on your local system and deploy it after it’s been debugged and works the way you want.

Google provides entity-based back-end data storage with a SQL-like query language (GQL) on their scalable infrastructure (BigTable) that will support very large data sets. Integration with Google Accounts allows for simplified user authentication.
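For a sense of what GQL looks like, here is a typical query against a hypothetical Greeting entity (the kind of model used in the getting-started tutorial); the entity and property names are illustrative:

```
SELECT * FROM Greeting WHERE author = :1 ORDER BY date DESC LIMIT 10
```

It reads much like SQL, but queries are restricted to a single entity kind and the `:1` placeholder is bound to a parameter at execution time.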

From the GAE web site:  “This is a preview release of Google App Engine. For now, applications are restricted to the free quota limits.”

Microsoft Windows Azure

Microsoft Windows Azure

Azure is essentially a Windows OS running in the cloud.  You are effectively uploading and running  your ASP.NET (IIS7) or .NET (3.5) application.  Microsoft provides tight integration of Azure development directly into Visual Studio 2008.

For enterprise Microsoft developers the .NET Services and SQL Data Services (SDS) will make Azure a very attractive option.  The Live Framework provides a resource model that includes access to the Microsoft Live Mesh services.

Bottom line for Azure: If you’re already a .NET programmer, Microsoft is creating a very comfortable path for you to migrate to their cloud.

Azure is now a CTP (Community Technology Preview) and is expected to be released later this year.

UPDATE (4/27/09) Here’s a good Azure article:  Patterns For High Availability, Scalability, And Computing Power With Windows Azure.

Getting Started

All three companies make it pretty easy to get software up and running in the cloud. The documentation is generally good, and each has a quick start tutorial to get you going. I tried out the Google App Engine tutorial and had Bob in the Clouds on their server in about 30 minutes.

Bob's Guest Book

Stop by and sign my cloud guest book!

Misc. Notes:

  • All three systems have Web portal tools for managing and monitoring uploaded applications.
  • The Dr. Dobbs article Computing in the Clouds has a more detailed look at AWS and GAE development.

Which is Best for You?

One of the first things that struck me about these options is how different they all are.  Because of this, from a developer’s point of view I think you’ll quickly have a gut feeling about which one best matches your current skill set and project requirements. The development components are just one piece of the selection process puzzle, though. Which one you actually end up using (it could very well be none) will also be based on all your other technical and business needs.

UPDATE (6/23/09): Here’s a good high-level cloud computing discussion: Reflections on Executive Briefing Event: Cloud & RIA.  I like the phrase “Cloud Computing is Elastic” because it captures most of the appealing aspects of the technology.  It’s no wonder Amazon latched on to that one — EC2.

Microsoft Research at PDC2008

Most of the press coming out of PDC2008 was about all the cool new product development technology announcements. What you may not appreciate is the depth and breadth of the Microsoft research effort that is really the foundation for many of these products.

The Day Three PDC Keynote by Rick Rashid (and others) is over 90 minutes long, but is worth a look.  It’s all fascinating stuff.

Hat tip: Dr. Neil’s Notes: Day Three PDC Keynote: Microsoft Research Magic

Google Health Launches: More PHR for the masses.

It’s finally here: Drumroll, Please: Google Health Launches!

If you use any of the Google applications (like Gmail), it’s just as easy as all the others:

Google Health

It will be interesting to see if this and HealthVault have an impact on how patients interact with their medical service providers. The privacy and security issues are certain to remain a significant barrier to adoption. Only time will tell.

UPDATE (5/23/08): See Delving Into Google Health’s Privacy Concerns

UPDATE (5/24/08): Apparently this Slashdot reference is “uninformed”: Why Google Health and HealthVault are not covered by HIPAA.

UPDATE (7/6/08): I ran across this post that talks about Microsoft HealthVault security: You Will Never Get It Microsoft. Here’s a quote from it:

Microsoft obviously think that I don’t know how HealthVault works. I don’t have to know how it works, I only know that it will and can be abused one day.


