Can Intuitive Software Design Support Better Health?
By Scott Frederick
Biometric technology is the new “in” thing in healthcare, allowing patients to monitor certain health characteristics—blood pressure, weight, activity level, sleep pattern, blood sugar—outside of the healthcare setting. When this information is shared with providers, it can help with population health management and long-term chronic disease care. For instance, when patients monitor their blood pressure using a biometric device and upload that information to their physician’s office, the physician can monitor the patient’s health remotely and tweak the care plan without having to physically see the patient.
For biometric technology to be effective, patients must use it consistently to capture a realistic picture of the health characteristics they are monitoring. Without regular use, it is hard to tell whether a reading is an anomaly or part of a larger pattern. The primary way to ensure consistent use is to design user-friendly biometric tools: it is human nature to avoid things that are too complicated, and individuals won’t hesitate to stop using a biometric device that is onerous or complex.
Let’s look at an example.
An emerging growth area for healthcare biometrics is wireless activity trackers—like FitBit—that can promote healthier lifestyles and spur weight loss. About three months ago, I started using one of these devices to see if monitoring metrics like the number of steps I walked, calories I consumed and hours I slept would make a difference in my health.
The tool is easy to use and convenient. I can monitor my personal metrics any time, anywhere, allowing me to make real-time adjustments to what I eat, when I exercise, and so on. For instance, at any given time, I can tell how many steps I’ve taken and how many more I need to take to meet my daily fitness goal. This shows me whether I need to hit the gym on the way home from work or whether my walk at lunch was sufficient. I can even make slight changes to my routine, choosing to stand up during conference calls or take the stairs instead of the elevator.
I download my data to a website, which provides easy-to-read and customizable dashboards, so I can track overall progress. I find I check that website more frequently than I look at Facebook or Twitter.
Now, imagine if the tool were bulky, slow, cumbersome, and hard to navigate, or if the dashboard where I view my data were difficult to understand. I would have stopped using it a while ago—or might never have started using it in the first place.
As with other hot technologies, several wireless activity trackers are infiltrating the market, each one promising to be the best. In reality, only the most well-designed applications will stand the test of time. These will be completely user-centric, designed to easily and intuitively meet user needs.
For example, a well-designed tracker will facilitate customization so users can monitor only the information they want and change settings on the fly. Such a tool will have multiple data entry points, so a user can upload his or her personal data any time and from anywhere. People will also be able to track their progress over time using clear, easy-to-understand dashboards.
Going forward, successful trackers may also need to keep providers’ needs in mind. While physicians have hesitated to embrace wireless activity monitors—encouraging patients to use the technology but not leveraging the data to help with care decisions—that perspective may be changing. It will be interesting to see whether physicians start looking at this technology in the future as a way to monitor their patients’ health choices. Ease of obtaining the data and having it interface with existing technology will drive provider use and acceptance.
While biometric tools are becoming more common in healthcare and stand to play a major role in population health management in the future, not every tool will be created equal. Those designed with the patient and provider in mind will rise to the top and improve the overall health of their users.
Scott Frederick, RN, BSN, MSHI is director of clinical insight for PointClear Solutions of Atlanta, GA.
Addressing Data Quality in the EHR
By Greg Chittim
What if you found out that you might have missed out on seven of your 22 ACO performance measures, not because of your actual clinical and financial performance, but because of the quality of data in your EHRs? It happens, but it’s not an intractable problem if you take a systematic approach to understanding and addressing data quality in all of your different ambulatory EHRs.
In HIStalk’s recent coverage of HIMSS14, an astute reader wrote:
Several vendors were showing off their “big data” but weren’t ready to address the “big questions” that come with it. Having dealt with numerous EHR conversions, I’m keenly aware of the sheer magnitude of bad data out there. Those aggregating it tend to assume that the data they’re getting is good. I really pushed one of the major national vendors on how they handle data integrity and the answers were less than satisfactory. I could tell they understood the problem because they provided the example of allergy data where one vendor has separate fields for the allergy and the reaction and another vendor combines them. The rep wasn’t able to explain how they’re handling it even though they were displaying a patient chart that showed allergy data from both sources. I asked for a follow up contact, but I’m not holding my breath.
All too often as the HIT landscape evolves, vendors and their clients are moving too quickly from EHR implementation to population health to risk-based contracts, glossing over (or skipping entirely) a focus on the quality of the data that serves as the foundation of their strategic initiatives. As more provider organizations adopt population health-based tools and methodologies, a comprehensive, integrated, and validated data asset is critical to driving effective population-based care.
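The allergy example in the quoted comment above shows the kind of source variability an aggregator has to resolve: one EHR stores the allergen and the reaction in separate fields, while another combines them in a single string. A minimal normalization sketch in Python (the field names and the delimiter are hypothetical, not taken from any particular vendor):

```python
import re

def normalize_allergy(record: dict) -> dict:
    """Map an allergy record from either source format to one schema.

    Hypothetical source A uses separate 'allergen' and 'reaction'
    fields; hypothetical source B combines them in a single 'allergy'
    field, e.g. 'Penicillin - hives'.
    """
    if "allergen" in record:  # source A: already split
        return {"allergen": record["allergen"].strip(),
                "reaction": record.get("reaction", "").strip() or None}
    # source B: split on the first delimiter; if no reaction was
    # recorded, keep the whole string as the allergen
    parts = re.split(r"\s*[-:]\s*", record["allergy"], maxsplit=1)
    return {"allergen": parts[0].strip(),
            "reaction": parts[1].strip() if len(parts) > 1 else None}

print(normalize_allergy({"allergen": "Penicillin", "reaction": "hives"}))
print(normalize_allergy({"allergy": "Penicillin - hives"}))
```

Both calls return the same normalized record, which is the point: downstream reports should never have to care which vendor supplied the data.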
Health IT maturity can be defined as four distinct steps:
High-quality data is a key foundational piece that is required to manage a population and drive quality. When the quality of data equals the quality of care physicians are providing, one can leverage that data as an asset across the organization. Quality data can provide detailed insight that allows pinpointing opportunities for intervention — whether it’s around provider workflow, data extraction, or patient follow-up and chart review. Understanding the origins of compromised data quality helps organizations recognize how to boost measure performance, maximize reimbursements, and lay the foundation for effective population health reporting.
It goes without saying that reporting health data across an entire organization is not an easy task. However, there are steps that organizations must take to ensure they are extracting sound data from their EHR systems.
Outlined below are the key issues that contribute to poor data quality impacting population health programs, how they are typically resolved, and more optimal ways organizations can resolve them.
Variability across disparate EHRs and other data sources
EHRs are inconsistent. Data feeds are inconsistent. Despite their intentions, standardized message types such as HL7 and CCDs still have a great deal of variability among sources. Even when they meet the letter of national standards, they rarely meet the true spirit of those standards when you try to use the data.
Take diagnoses, for example. Patient diagnoses can often be recorded in three different locations: on the problem list, as an assessment, and in medical history. Problem lists and assessments are both structured data, but generally only diagnoses recorded on the problem list are transported to the reports via the CCD. This translates to underreporting on critical measures that require records of DM, CAD, HTN, or IVD diagnoses. Accounting for this variability is critical when mapping data to a single source of truth.
Standard approach: Most organizations try to use consistent mapping and normalization logic across all data sources. Validation is conducted by doing sanity checks, comparing new reports to old.
Best practice approach: To overcome the limitations of standard EHR feeds like the CCD, reports need to pull from all structured data fields in order to achieve performance rates that reflect the care physicians are rendering: either workflow needs to be standardized across providers, or reporting tools need to be comprehensive and flexible in the data fields they pull from.
The optimal way to resolve this issue is to tap into the back end of the EHR. This allows you to see what data is structured vs. unstructured. Once you have an understanding of the back-end schema, data interfaces and extraction tools can be customized to pull data where it is actually captured, as well as where it should be captured. In addition, validation of individual data elements needs to happen in collaboration with providers, to ensure completeness and accuracy of data.
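To make the problem-list gap concrete, here is a small sketch of the difference between a CCD-style extract and a back-end extract that unions all structured locations. The chart layout and ICD-10 codes are illustrative assumptions, not an actual EHR schema:

```python
def diagnoses_from_ccd(chart: dict) -> set:
    """What a typical CCD transports: the problem list only."""
    return set(chart.get("problem_list", []))

def diagnoses_from_backend(chart: dict) -> set:
    """Back-end extraction: union of all structured locations."""
    fields = ("problem_list", "assessments", "medical_history")
    return set().union(*(chart.get(f, []) for f in fields))

chart = {
    "problem_list": ["I10"],         # hypertension on the problem list
    "assessments": ["E11.9"],        # diabetes recorded as an assessment
    "medical_history": ["I25.10"],   # CAD in medical history
}
print(sorted(diagnoses_from_ccd(chart)))      # only the problem list
print(sorted(diagnoses_from_backend(chart)))  # all three diagnoses
```

A measure that requires a DM or CAD diagnosis would miss this patient entirely under the CCD-only extract, which is exactly the under-reporting mechanism described above.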
Variability in provider workflows
EHRs are not perfect and providers often have their own ways of doing things. What may be optimal for the EHR may not work for the providers or vice versa. Within reason, it is critical to accommodate provider workflows rather than forcing them into more unnatural change and further sacrificing efficiency.
Standard approach: Most organizations ignore this and go to one extreme or the other: (1) use consistent mapping and normalization logic across all data sources and user workflows, assuming that all providers use the EHR consistently, or (2) allow workflows to dictate everything and fight a losing battle to make the data integration infinitely adaptable. Again, validation is conducted using sanity checks, comparing new reports to old.
Best practice approach: Understand how each provider uses the system and identify where the provider is capturing all data elements. Building in a core set of workflows and standards dictated by an on-the-ground clinical advisory committee, with flexibility for effective variations, is critical. With a standard core, data quality can be enhanced by tapping into the back end of the EHR to fully understand how data is captured, as well as by spending time with care teams to observe their variable workflows. To avoid disruption in provider workflows, interfaces and extraction tools can be configured to map data correctly, regardless of how and where it is captured. Robust validation of individual data elements needs to happen in collaboration with providers to ensure that the completeness and accuracy of the data (that is, the quality of the data) match the quality of care being delivered.
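One way to realize "map data correctly, regardless of how and where it is captured" is a per-provider field map that the extraction layer consults. This is a hypothetical sketch, not any vendor's implementation; the provider names, element names, and paths are invented for illustration:

```python
# Hypothetical per-provider field map: extraction adapts to where each
# provider actually records a data element, instead of forcing one workflow.
FIELD_MAP = {
    "dr_smith": {"smoking_status": "social_history.tobacco"},
    "dr_jones": {"smoking_status": "intake_form.smoker_flag"},
}

def get_element(chart: dict, provider: str, element: str):
    """Follow the provider-specific dotted path to the value."""
    path = FIELD_MAP[provider][element]
    value = chart
    for key in path.split("."):
        value = value.get(key)
        if value is None:  # element not captured where expected
            return None
    return value

chart_smith = {"social_history": {"tobacco": "former smoker"}}
chart_jones = {"intake_form": {"smoker_flag": "N"}}
print(get_element(chart_smith, "dr_smith", "smoking_status"))
print(get_element(chart_jones, "dr_jones", "smoking_status"))
```

Both providers keep their workflows; the map, not the clinician, absorbs the variation. In practice the map itself is what the clinical advisory committee and chart validation sessions maintain.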
Build provider buy-in/trust in system and data through ownership
If providers do not trust the data, they will not use population health tools. Without these tools, providers will struggle to effectively drive proactive, population-based care or quality improvement initiatives. Based on challenges with EHR implementation and adoption over the last decade, providers are often already skeptical of new technology, so getting this right is critical.
Standard approach: Many organizations simply conduct a data validation process by doing a sanity test comparing old reports to new. Reactive fixes are made to correct errors in data mapping, but often too late, after providers have already lost trust in the system.
Best practice approach: Yet again, it is important to build out a collaborative process to ensure every single data element is mapped correctly. First meetings to review data quality usually begin with a statement akin to “your system must be wrong — there’s no way I am missing that many patients.” This is OK. Working side by side with providers to ensure they understand where the data comes from and how to modify both workflows and calculations ensures that they are confident the reports accurately reflect the quality of care they are rendering. This confidence is a critical success factor in the eventual adoption of these population health tools in a practice.
Missed incentive payments under value-based reimbursement models
An integrated data asset that combines data from many sources should always add value and give meaningful insight into the patient population. A poorly mapped and validated data asset can actually compromise performance, lower incentive reimbursements, and ultimately result in a negative ROI.
Standard approach: A lackluster data validation process can result in lost revenue opportunities, as data will not accurately reflect the quality of care delivered or accurately report the risk of the patient population.
Best practice approach: Using the previously described approach when extracting, mapping, and validating data is critical for organizations that want to see a positive ROI in their population health analytics investments. Ensuring data is accurate and complete will ensure tools represent the quality of care delivered and patient population risk, maximizing reimbursement under value-based payments.
We worked with a sample ACO physician group of over 50 physicians to assess the quality of data being fed from multiple EHRs within their system into an existing analytics platform via CCDs and pre-built feeds. Based on an assessment of 15 clinically sensitive ACO measures, we discovered that the client’s reports were under-reporting 12 of the 15 measures based solely on data quality. Measures were under-reported by an average of 28 percentage points, with one measure under-reported by 100 percentage points.
Reports erroneously showed that only six of the 15 measures met 2013 targets, while a manual chart audit revealed that 13 of the 15 measures met them, indicating that data was not being captured, transported, and reported accurately. By simply addressing these data quality issues, the organization could potentially see additional financial returns through quality incentive reimbursements as well as a reduced need for labor-intensive chart audits.
As the industry continues to shift toward value-based payment models, the need for an enterprise data asset that accurately reflects the health and quality of care delivered to a patient population is increasingly crucial for financial success. Providers have suffered enough with drops in efficiency since going live on EHRs. Asking them to make additional significant changes in their daily workflows to make another analytics tool work is not often realistic.
Analytics vendors need to meet providers where they are to add real value to their organizations. Working with providers and care teams not only to validate the integrity of the data, but also to instill trust and give them the confidence they need to adopt these analytics tools into their everyday workflows, is extremely valuable and often overlooked. These critical steps allow providers to begin driving population-based care and quality improvement in their practices, positioning them for success in the new era of healthcare.
Greg Chittim is senior director of Arcadia Healthcare Solutions of Burlington, MA.
The views and opinions expressed in this blog are mine personally and are not necessarily representative of current or former employers.
How Snow White Changed My Life
OK, life change is a stretch, but Snow and some of her peer princesses did remind me of a critical aspect of leadership—creating special moments. In the case of Disney, it’s “where dreams come true.” For my Starbucks aficionados, it’s, “Handcrafted beverages are the secret to making life better.”
Five years ago, I added “create perfect moments” to my personal strategic plan. Putting it in writing is one technique to help ensure it moves from bench to bedside. In the big things of my life, this has worked well, but not in the common, everyday stuff of earth.
While in Orlando recently, I spent time exploring Disney’s Epcot. Just for fun — and to make my wife and 20-year-old daughter smile — I decided to grab a photo op with Snow White.
Was my pride ever challenged! There I was, sandwiched between animated toddlers and star-struck preteens, in line to take a pic with Ms. Purity herself. Seemed everyone was dressed like a princess except me. I stood close to one toddler hoping passersby would think I was part of her family. Heaven forbid someone I knew might see me standing in line at Disney for a personal princess pic.
My turn came. I sheepishly held my arm out for Snow White. My friend took the pic.
I was ready to run, but Snow would not let me go. Help! She turned, looked me in the eye, and engaged me in conversation. I was pulling away, but she kept me there. It was longer than a moment, but not excessive, maintaining eye contact the entire time. As if someone just discovered my hand in the cookie jar, I was about to break out in a nervous sweat.
I texted the pic to my wife and daughter and they both replied ROTFL. So when I saw Sleeping Beauty, I stepped in line again.
This time, I carefully observed all the interactions between the princess and her devotees. Miss Beauty held eye contact with every fan and engaged in brief conversation.
My turn came, and though I tried to pull away, she clung to my arm until we talked. Awkward, yes, but so enlightening. Ditto with Belle, Cinderella, and last but not least, Ariel. They were indeed making dreams come true for their fans. They made me feel important.
How can we take something as simple and yet profound as a Disney princess engagement formula and put it into practice ourselves? How can we allow this to become a natural part of who we are?
As leaders, we are so rushed. I preach to myself here. We walk past our staff with nary an acknowledgement. When we do stop to talk, we are thinking about the meeting we are headed to.
On one hand, we claim that the right people in the right places are our most valuable assets. But do we give them the gift of our time, fully present, even for just a minute? When we don’t, our leadership contradicts itself.
Since my return from Disney, I’ve been doubling down on creating special moments, this time with my staff. I am making sure every interaction, however brief, is meaningful. Eye contact. Genuine interest. While the other person may be rushed, I will remind myself that my agenda is their agenda, and my role as a leader is to serve them. True, not every person will want the time, but for those who do, I am there.
Before the end of my final day at Disney, I was looking for the next princess. Why? Because I enjoyed the way they made me feel. Special. If a princess can do this for strangers, we can do it for those we serve. Pics or no pics.
Create special moments.
Ed Marx is a CIO currently working for a large integrated health system. Ed encourages your interaction through this blog. Add a comment by clicking the link at the bottom of this post. You can also connect with him directly through his profile pages on social networking sites LinkedIn and Facebook and you can follow him via Twitter — user name marxists.
Jim Prekop is president and CEO of TeraMedica of Milwaukee, WI.
Tell me about yourself and the company.
I’ve been in health IT for about 30 years. The last 10 have been with TeraMedica. Before that, I was in the EMR space and companies like PeopleSoft and Dun & Bradstreet software.
TeraMedica is middleware. The industry term is vendor-neutral archive. We collect clinical objects and are responsible for making them available to the source system, but also making them available in a patient-centric view to additional consumers of that data, whether they’re outside in institutions, exchanges, or new technology that gets adopted by the provider. We perform that role in the healthcare architecture.
How has the unbundling of PACS from single-solution vendors changed the demand for vendor-neutral archives and what’s the end result for the provider and the patient?
It’s a natural progression. Historically with systems, the new idea is more or less a closed-loop answer. It’s the same way with accounting systems going back decades.
What was a box has now become a layer in the architecture: the process of acquiring and managing an image and then making it available down the road to new consumers, or later in my lifetime. The solution has had to evolve. The VNA, with its ability to interact seamlessly with departmental activity while serving as the conduit into the enterprise, is a natural progression. It’s not to say that PACS is bad, just that the focus going forward on PACS will be different, just as the responsibility of the VNA will change over time as well.
What about universal viewers?
The universal viewer is interesting. They’re approaching this through the lens of the physician, whereas the VNA approaches it from the infrastructure up.
The advantage for the enterprise viewer is that they can combine data from multiple sources. But the other thing that has to be kept in mind is that there is response time and there is certainty that is needed in what is delivered to the enterprise viewer. You get into a federated discussion of going after 20 different data sources, combining that answer, and then delivering it in one view to the clinician versus the ability to have all of that patient matching resolved by the VNA. It’s one-stop shopping. It goes to any consumer of the VNA.
We see the consumers being an EMR. We see the consumers being an enterprise viewer. Going forward, as more adoption comes into the United States, it will be different exchanges that imaging will become part of. So to us, it’s just a consumer. We optimize each consumer’s ability to be confidently assured that they’ve asked for and gotten the right information and that all the information is there. If you have a federated view and make a request and one of those systems is down, you might not get the answer.
Enterprise viewer implies that there’s behind the scenes fetching going on that then presents a unified view, as opposed to the VNA where it’s actually stored in a single system.
Yes. It’s already stored and normalized and you’re having one conversation behind the scenes.
Unless somebody’s invented something new in IT that I haven’t seen, you pretty much have to ask the same question across multiple systems or go to some sort of index and find out all the Jim Prekops and then go and find out where they’re located, go get them, and then present it to me in an organized way. Can those enterprise viewers do that? Absolutely, and we have great partners in that space. Is it the best experience for the provider or the clinician? Maybe not.
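Prekop's contrast between a federated fetch and a pre-populated VNA can be sketched as follows. The system names and object records are hypothetical; the point is that a federated query silently degrades when a source is down, while the VNA answers from one already-normalized store:

```python
def federated_fetch(systems: dict, patient_id: str):
    """Query every source at request time; any downtime means an
    incomplete answer is delivered to the viewer."""
    results, missing = [], []
    for name, system in systems.items():
        if system.get("up", True):
            results.extend(o for o in system["objects"]
                           if o["patient"] == patient_id)
        else:
            missing.append(name)  # silently incomplete unless surfaced
    return results, missing

def vna_fetch(vna_store: list, patient_id: str) -> list:
    """One conversation with a store populated and patient-matched
    ahead of time."""
    return [o for o in vna_store if o["patient"] == patient_id]

systems = {
    "radiology_pacs": {"up": True,
                       "objects": [{"patient": "p1", "type": "CT"}]},
    "cardiology":     {"up": False,  # down at query time
                       "objects": [{"patient": "p1", "type": "echo"}]},
}
vna = [{"patient": "p1", "type": "CT"}, {"patient": "p1", "type": "echo"}]

found, missing = federated_fetch(systems, "p1")
print(len(found), missing)        # incomplete view, one source down
print(len(vna_fetch(vna, "p1")))  # complete view from the single store
```

The real trade-off is where the integration work happens: the federated approach pays it at every query, the VNA pays it once at ingestion.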
What are the optimal ways to integrate a variety of images into Epic or Cerner?
I call it a landing page. EMRs address all the departments in the organization and rightfully so. But if I want to go look at all the different clinical objects that Jim Prekop created in a facility, chances are the links to that information are within various locations within the EMR.
One of the advantages that TeraMedica brings to the table to leverage the investment that the provider has in the EMR is to give a patient-centered view of all the clinical objects, should they want that. That’s an option in our system. We can be tied to a report and just show that image, or we can present a complete inventory of what we have in the VNA, so that in one location, a clinician can see things that might be related to other departments. I don’t necessarily have to navigate over to that section of the EMR to see those objects.
It’s probably important to note that all images are objects but not all objects are images. Are you seeing demands for new object types?
Absolutely. When I first got here, I had to get an education on DICOM and all the nuances and it was a big education. But not everything is DICOM when it comes to clinical objects.
Our customers asked us very early on not to manage just DICOM. It’s a wonderful thing and is the heavy lifting in our business. But to be truly patient-centric, you have to address all different file types, whether they be JPEGs, MPEGs, PDFs, or Word documents. In the case of cancer care, for example, lots of calculations are done using Excel and other types of planning systems.
To represent that an image is just a DICOM object is not fair. It’s usually one of the arguments when you try and decide what a VNA really is. There are lots of folks that manage DICOM and they do a good job, but they declare themselves as the VNA. That doesn’t meet our definition of a VNA.
What’s the distinction between storing non-DICOM data in its native format instead of using a DICOM wrapper?
Unlike other industries, where you can create data marts and, if there’s a problem, just snap another copy of the data, we’re into terabytes and hundreds of terabytes of data. As you acquire that information as the VNA, you have to be clinically responsible to the source system. If I go get a PDF of Jim Prekop from a clinical system and I wrap it in DICOM and that system wants it back, I either have to create duplicate storage — which is not cost-effective — or I have to be able to unwrap it from that DICOM and return it as a PDF to that source system.
The overhead of doing that simply doesn’t work and it doesn’t scale. To believe that you have to wrap everything in DICOM so it follows how your system works … I would suggest you have the wrong system if it only works with DICOM.
A well-known VNA consultant who comes from a PACS mentality is adamant that everything should be wrapped in DICOM. We needed to get him to sign an updated non-disclosure agreement, so I had my engineers wrap our NDA in DICOM before I sent it to him. He asked me what I had sent him, since he operates on a Macintosh that doesn’t understand the file type, which is a .UCM. He didn’t even recognize that I had sent him a DICOM file. He didn’t realize that he was essentially justifying why we believe a VNA has to handle both DICOM and non-DICOM.
Who are your main competitors and how do you differentiate your product from theirs?
Since the VNA term was adopted — I prefer Vendor-Neutral Architecture — lots of folks have thrown their hats into the ring. As you would expect, a lot of PACS vendors have begun to open up and allow multiple DICOM systems to enter data into their archives.
It’s usually TeraMedica and Acuo that end up being the finalists in any evaluation. There are some other ones that are out there that do some of the things that we do. There’s some newcomers — Mach7 is out there, but I think they have more activity outside the US than they do within the US. But there are others that are coming into the space, and rightfully so. It’s a competitive market.
Hospitals that acquire medical practices and each other are left trying to figure out how to get their systems to talk to each other. Is that true of imaging systems or other systems that would populate a VNA?
There are two aspects to that. We have organizations that are buying us because they’re strategically positioning themselves to acquire other entities. They know that they can’t rip out those clinical systems, so they will use us as part of their strategy to get control of the data and share it across the enterprise.
As far as the other way, we have sites that are established either because of acquisitions or because of differences on campuses that have multiple EMRs. Our technology allows, again using myself as the example, Jim Prekop to be referenced, and if I know the request is coming from Epic, I’ll behave one way to put it properly in Epic. At the same time, I can put it into Cerner. There’s one source of the truth.
One of the value propositions that we bring as a VNA is that we can identify consumers and react accordingly. We can also respond to multiple consumers, yet give each one the exact data it is looking for, whether the request comes in through the physician’s office with one EMR or through the hospital with another EMR. It’s one source of the truth with multiple consumers.
Where do you see the company going in the next three to five years?
I think it’s based around being a good partner with our customers and bringing them more use cases and more ways of managing the data. As you would expect, we can sit behind a PACS, but the thing about VNAs is we’ve had to come around the curtain. We’ve always considered ourselves the plumbing behind the scenes. But now we’re very active in different departmental workflows.
We’re getting involved with our iPad app, as an example, in departments like wound care and dermatology, where the clinicians are actually interacting with our software and we are part of the EMR, but the clinician doesn’t even know we’re there. A lot of times when someone says, “I didn’t know you were there,” that’s a bad thing. For us, that’s a good thing, because we want seamless integration into these different systems. I can see us doing more of it.
I can see us taking responsibilities for more functions of a generic nature in the provider space so that they can optimize the platform that they’ve invested in. Clearly the leading investment is the EMR. But the VNA is also a strategic investment, and we need to do more for them when it comes to clinical workflow.
If you’ve ever traveled to a country that doesn’t speak your native tongue, you can appreciate the importance of basic communication. If you learn a second language to the degree that you’re adding nuance and colloquialisms, you’ve experienced how much easier it is to explain a point or to get answers you need. What if you’re expected to actually move to that foreign country under a strict timeline? The pressure is on to get up to speed. The same can be said for learning the detailed coding language of ICD-10.
The healthcare industry has been preparing in earnest to move from ICD-9 coding to the latest version of the international classification of diseases. People have been training, testing, and updating information systems, essentially packing their bags to comply with the federal mandate to implement ICD-10 this October — but the trip was postponed. On April 1, President Barack Obama signed into law a bill that includes an extension for converting to ICD-10 until at least Oct. 1, 2015. What does this mean for your ICD-10 travel plans?
Despite the unexpected delay, you’ll be living in ICD-10 country before you know it. With at least another year until the deadline, the timing is just right to start packing and hitting the books to learn the new codes and to prepare your systems. For those who have a head start, your time and focus have not gone to waste, so don’t throw your suitcases back into the closet. The planning, education, and money invested in preparing for the ICD-10 transition don’t dissolve with the delay — you’ve collected valuable tools that will be put to use.
Although many people, including me, are disappointed by the delay, we need to continue making progress toward the conversion; learning and using ICD-10 will enable the United States to have more accurate, current, and appropriate medical conversations with the rest of the world. Considering that it is almost four decades old, there is only so much communication that ICD-9 can handle; some categories are actually full as the number of new diagnoses continues to grow. ICD-9 uses three to five numeric characters for diagnosis coding, while ICD-10 uses three to seven alphanumeric characters. ICD-10 classifications will provide more specific information about medical conditions and procedures, adding depth and accuracy to conversations about a patient’s diagnosis and care.
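The structural difference described above can be expressed as a rough shape check. These patterns validate format only, not whether a code actually exists, and they deliberately ignore ICD-9 V and E codes and other edge cases:

```python
import re

# Simplified shape checks for diagnosis codes, ignoring decimal-point
# placement rules and ICD-9 V/E codes.
ICD9_SHAPE = re.compile(r"^\d{3}(\.\d{1,2})?$")              # 3-5 numeric chars
ICD10_SHAPE = re.compile(r"^[A-Z]\d{2}(\.[0-9A-Z]{1,4})?$")  # 3-7 alphanumeric

for code in ("250.00", "E11.9", "I25.10"):
    label = ("ICD-9" if ICD9_SHAPE.match(code)
             else "ICD-10" if ICD10_SHAPE.match(code)
             else "unrecognized")
    print(code, "->", label)
```

A real system would still need the full code tables; checks like these only catch codes that cannot possibly be valid in the target code set.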
Making the jump to ICD-10 fluency will be beneficial, albeit challenging. In order to study, understand, and use ICD-10, healthcare organizations need to establish a learning system for their teams. The Breakaway Group, A Xerox Company, provides training for caregivers and coders that eases learning challenges, such as the expanded clinical documentation and new code set for ICD-10. Simply put, there are people who can help with your entire ICD-10 travel itinerary, from creating a checklist of needs to planning a successful route.
ICD-10 is the international standard, so the journey from ICD-9 codes to ICD-10 codes will happen. Do not throw away your ICD-10 coding manuals and education materials just yet. All of these items will come in handy to reach the final destination: ICD-10.
Xerox is a sponsor of the Breakaway Thinking series of blog posts.
This is a guest blog by Nial Toner of PathXL, a vendor of cloud-based digital pathology systems. I asked him to discuss the benefits of cloud computing in digital pathology and barriers to its deployment. There will be some emphasis placed on digital pathology at the upcoming Pathology Informatics Summit 2014 (see: Digital Pathology Well Represented at Pathology Informatics Summit 2014)--BAF
In digital pathology, cloud computing can help to deliver cost effective healthcare and also help to manage the growing amount of data that is generated by the technology. Cloud computing provides many benefits but also some drawbacks. The benefits of cloud computing in digital pathology are the following:
Despite these benefits, some reservations and barriers to using cloud technology in digital pathology persist and include:
While the benefits are substantial, cloud computing has yet to make major inroads in pathology. Despite this, cloud computing in support of digital pathology is increasingly being used in medical education and research settings. The future for cloud technology looks bright: the value of cloud computing for the healthcare industry has been predicted to reach $5.4 billion by 2017. We are all adapting to an increasingly mobile world, and digital pathology will make a major contribution to that shift.
The FDA, ONC, and FCC will co-host a free three-day public workshop at NIST’s campus in Gaithersburg, MD from May 13-15. The event will give experts and stakeholders an opportunity to provide input on the recently published FDASIA health IT report.
CMS finally acknowledges the ICD-10 delay in a new post on its ICD-10 readiness website that says, "CMS is examining the implications of the ICD-10 provision and will provide guidance to providers and stakeholders soon."
A study at Beth Israel Deaconess Medical Center (MA) that compared the quality scores of 540 physicians who achieved MU with those of 318 physicians who did not found that adoption of Meaningful Use does not correlate with improved quality.
A local paper covers the launch of two competing health information exchanges in Oklahoma and discusses the impact the competition will have on the overall sustainability of the project.
A post-stroke rehabilitation system integrating robotics, VR and high-resolution EEG imaging.
IEEE Trans Neural Syst Rehabil Eng. 2013 Sep;21(5):849-59
Authors: Steinisch M, Tana MG, Comani S
We propose a system for the neuro-motor rehabilitation of upper limbs in stroke survivors. The system is composed of a passive robotic device (Trackhold) for kinematic tracking and gravity compensation, five dedicated virtual reality (VR) applications for training of distinct movement patterns, and high-resolution EEG for synchronous monitoring of cortical activity. In contrast to active devices, the Trackhold omits actuators for increased patient safety and acceptance levels, and for reduced complexity and costs. VR applications present all relevant information for task execution as easy-to-understand graphics that do not need any written or verbal instructions. High-resolution electroencephalography (HR-EEG) is synchronized with kinematic data acquisition, allowing for the epoching of EEG signals on the basis of movement-related temporal events. Two healthy volunteers participated in a feasibility study and performed a protocol suggested for the rehabilitation of post-stroke patients. Kinematic data were analyzed by means of in-house code. Open source packages (EEGLAB, SPM, and GMAC) and in-house code were used to process the neurological data. Results from kinematic and EEG data analysis are in line with knowledge from currently available literature and theoretical predictions, and demonstrate the feasibility and potential usefulness of the proposed rehabilitation system to monitor neuro-motor recovery.
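As a side note for the technically curious, the "epoching" step mentioned in the abstract (cutting a continuous recording into fixed windows around movement-related temporal events) can be sketched in a few lines. This is my own illustrative reconstruction of the general technique, not the authors' in-house code; the function name and window parameters are made up.

```python
def epoch_signal(samples, event_indices, pre, post):
    """Cut a continuous 1-D signal into fixed windows around events.

    samples: list of signal values (e.g. one EEG channel)
    event_indices: sample indices of movement-related events
    pre/post: number of samples to keep before/after each event
    Events too close to the edges of the recording are skipped.
    """
    epochs = []
    for idx in event_indices:
        start, stop = idx - pre, idx + post
        if start >= 0 and stop <= len(samples):
            epochs.append(samples[start:stop])
    return epochs

# Toy example: a 10-sample recording with events at samples 3 and 8,
# keeping 2 samples before and 3 after each event. The second event
# is skipped because its window runs past the end of the recording.
signal = list(range(10))
print(epoch_signal(signal, [3, 8], pre=2, post=3))  # [[1, 2, 3, 4, 5]]
```

In practice the same slicing is done per channel across a high-resolution EEG montage, which is why synchronizing the kinematic and EEG clocks matters: the event indices come from the movement data.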
Brain-computer interfaces: a powerful tool for scientific inquiry.
Curr Opin Neurobiol. 2014 Apr;25C:70-75
Authors: Wander JD, Rao RP
Abstract. Brain-computer interfaces (BCIs) are devices that record from the nervous system, provide input directly to the nervous system, or do both. Sensory BCIs such as cochlear implants have already had notable clinical success and motor BCIs have shown great promise for helping patients with severe motor deficits. Clinical and engineering outcomes aside, BCIs can also be tremendously powerful tools for scientific inquiry into the workings of the nervous system. They allow researchers to inject and record information at various stages of the system, permitting investigation of the brain in vivo and facilitating the reverse engineering of brain function. Most notably, BCIs are emerging as a novel experimental tool for investigating the tremendous adaptive capacity of the nervous system.
Android Wear will show you info from a wide variety of Android apps, such as messages, social apps, chats, notifications, health and fitness, music playlists, and videos.
It will also enable Google Now functions — say “OK, Google” to check flight times, send a text, get the weather, view email, get directions, estimate travel time, make a reservation, and more.
Google says it’s working with several other consumer-electronics manufacturers, including Asus, HTC, and Samsung; chip makers Broadcom, Imagination, Intel, Mediatek and Qualcomm; and fashion brands like the Fossil Group to offer watches powered by Android Wear later this year.
If you’re a developer, there’s a new section on developer.android.com/wear focused on wearables. Starting today, you can download a Developer Preview so you can tailor your existing app notifications for watches powered by Android Wear.
A Hybrid Brain Computer Interface System Based on the Neurophysiological Protocol and Brain-actuated Switch for Wheelchair Control.
J Neurosci Methods. 2014 Apr 5;
Authors: Cao L, Li J, Ji H, Jiang C
BACKGROUND: Brain Computer Interfaces (BCIs) are developed to translate brain waves into machine instructions for control of external devices. Recently, hybrid BCI systems have been proposed for multi-degree control of a real wheelchair to improve the efficiency of traditional BCIs. However, it is difficult for existing hybrid BCIs to implement multi-dimensional control in one command cycle.
NEW METHOD: This paper proposes a novel hybrid BCI system that combines motor imagery (MI)-based bio-signals and steady-state visual evoked potentials (SSVEPs) to control the speed and direction of a real wheelchair synchronously. Furthermore, a hybrid modalities-based switch is firstly designed to turn on/off the control system of the wheelchair.
RESULTS: Two experiments were performed to assess the proposed BCI system. One was implemented for training and the other one conducted a wheelchair control task in the real environment. All subjects completed these tasks successfully and no collisions occurred in the real wheelchair control experiment.
COMPARISON WITH EXISTING METHOD(S): The protocol of our BCI gave much more control commands than those of previous MI and SSVEP-based BCIs. Comparing with other BCI wheelchair systems, the superiority reflected by the index of path length optimality ratio validated the high efficiency of our control strategy.
CONCLUSIONS: The results validated the efficiency of our hybrid BCI system to control the direction and speed of a real wheelchair as well as the reliability of hybrid signals-based switch control.
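To visualize how the pieces of such a protocol fit together, here is a toy state-machine sketch: a switch signal gates the whole system, then one classifier output sets direction while another sets speed within the same command cycle. This is purely illustrative of the structure described in the abstract; the class name and string labels are my own inventions, and none of the actual MI/SSVEP signal processing is shown.

```python
class HybridWheelchairController:
    """Toy model of a hybrid BCI protocol: a brain-actuated switch
    turns the system on/off, then the motor-imagery (MI) output sets
    direction and the SSVEP output sets speed in one command cycle."""

    def __init__(self):
        self.active = False

    def toggle_switch(self):
        # In the paper this is driven by a hybrid-modalities switch;
        # here it is just a boolean flip.
        self.active = not self.active

    def command(self, mi_direction, ssvep_speed):
        # mi_direction: e.g. 'left'/'right'/'forward' from the MI decoder
        # ssvep_speed: e.g. 'slow'/'fast' from the SSVEP decoder
        if not self.active:
            return "idle"
        return f"{mi_direction}@{ssvep_speed}"

ctrl = HybridWheelchairController()
print(ctrl.command("left", "slow"))   # idle (switch is off)
ctrl.toggle_switch()
print(ctrl.command("left", "slow"))   # left@slow
```

The point of the hybrid design is visible even in this caricature: two independent decoder outputs yield a direction-and-speed command per cycle, which a single-modality BCI cannot do.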
Glyph looks like a normal headset and operates like one, too. That is, until you move the headband down over your eyes and it becomes a fully-functional visual visor that displays movies, television shows, video games or any other media connected via the attached HDMI cable.
Using Virtual Retinal Display (VRD), a technology that mimics the way we see light, the Glyph projects images directly onto your retina using one million micromirrors in each eye piece. These micromirrors reflect the images back to the retina, producing a reportedly crisp and vivid quality.
I’ve written regularly about the need for secure text messaging in healthcare. I can’t believe that it was two years ago that I wrote that Texting is Not HIPAA Secure. Traditional SMS texting on your cell phone is not HIPAA secure, but there are a whole lot of alternatives. In fact, in January I made the case for why, even without HIPAA, secure text messaging was a much better alternative to SMS.
Those that know me (or read my byline at the end of each article) know that I’m totally biased on this front since I’m an adviser to the secure text messaging company docBeat. With that disclaimer, I encourage all of you to take a frank and objective look at the potential for HIPAA violations and the potential benefits of secure text over SMS and decide for yourself whether there is value in these secure messaging services. This amazing potential is why I chose to support docBeat in the first place.
While I’ve found the secure messaging space really interesting, what I didn’t realize when I started helping docBeat was how many parts of the healthcare system could benefit from something as simple as a secure text message. When we first started talking about secure text, we were completely focused on providers texting in ambulatory practices and hospitals. We quickly realized the value of secure texting with other members of the clinic or hospital organization: nurses, front desk staff, HIM, and so on.
What’s been interesting in the evolution of docBeat was how many other parts of the healthcare system could benefit from a simple secure text message solution. Some of these areas include things like: long term care facilities, skilled nursing facilities, Quick Care, EDs, Radiology, Labs, rehabilitation centers, surgery centers, and more. This shouldn’t have been a surprise since the need to communicate healthcare information that includes PHI is universal and a simple text message is often the best way to do it.
The natural next extension for secure messaging is to connect it to patients. The beautiful part of secure text messaging apps like docBeat is that patients aren’t intimidated by the messages they receive from docBeat. The same can’t be said for most patient portals, which require all sorts of registration, logins, forms, etc. Every patient I know is happy to read a secure text message. I don’t know many who want to log in to a portal.
Over the past couple years the secure text messaging tide has absolutely shifted and there’s now a land grab for organizations looking to implement some form of secure text messaging. In some ways it reminds me of the way organizations were adopting EHR software a few years back. However, we won’t need $36 billion to incentivize the adoption of secure text messaging. Instead, market pressures will make it happen naturally. Plus, with ICD-10 delayed another year, hopefully organizations will have time to focus on small but valuable projects like secure text messaging.
I recently chaired a couple of conferences and my next HealthIMPACT event is coming up later this month in NYC. At each one of the events and many times a year via twitter and e-mail I am asked whether the Direct Project is successful, worth implementing in health IT projects, and if there are many people sending secure messages using Direct. To help answer these questions, I reached out to Rachel A. Lunsford, Director of Product Management at Amida Technologies. Amida has amassed an impressive team of engineers to focus on health IT for patient-centered care so their answer will be well grounded in facts. Here’s what Rachel said when I asked whether Direct is a myth or if it’s real and in use:
Despite wide adoption in 44 States, there is a perception that Direct is not widely used. In a recent conversation, we discussed a potential Direct secure messaging implementation with a client when they expressed concern about being a rare adopter of Direct messaging. While the team reassured them that their organization would in fact be joining a rich ecosystem of adopters, they still asked us to survey the market.
In 2012, the Office of the National Coordinator for Health Information Technology (ONC) awarded grants to State Health Information Exchanges to further the exchange of health information. There are two primary ways to exchange information: directed and query-based. ‘Directed’ exchange is what it sounds like – healthcare providers can send secure messages with health information attached to other healthcare providers that they know and trust. The most common type of ‘Directed’ exchange is Direct, which is a secure, scalable, standards-based way to send messages. Query-based exchange relies on a federated database or central repository, which is much harder to implement, so growth in this area is slower. Thanks in part to the grants and in part to the simplicity of the Direct protocol, 44 States have adopted Direct and widely implemented it. And yet the myth persists that Direct is not well adopted or used.
As with other new technologies, it may be hard to see the practical applications. When Edison and Tesla were dueling to find out which standard – direct or alternating current – would reign supreme, many were unsure if electricity would even be safe enough, never mind successful enough, to replace kerosene in the street lamps. It was impossible for people to foresee a world where many live in well-lit homes on well-lit streets, and none could have imagined using tools like the computer or the Internet. Thankfully, the standards debate was sorted out and we continue to benefit from it today.
There are two groupings of data we can look towards for more detail on use of Direct. The first are the States themselves; they self-report transaction and usage statistics to the ONC. It was reported in the third quarter of 2013 that the following were actively exchanging some 165 million ‘Directed’ messages:
Another organization collecting data on Direct implementation is DirectTrust.org. Charged by ONC, DirectTrust.org oversees development of the interoperability framework and rules used by Direct implementers, works to reduce implementation costs, and removes barriers to implementation. Additionally, DirectTrust supports those who want to serve as sending and receiving gateways known as health information service providers (HISPs). By DirectTrust.org’s count, the users number well over 45,000 with at least 16 organizations accredited as HISPs. Further, over two million messages have been exchanged with the roughly 1,500 Direct-enabled sites. With Meaningful Use encouraging the use of Direct, we can expect even more physicians and healthcare organizations to join in.
As more doctors are able to exchange records, everyone will benefit. When a provider can receive notes and records from other providers to see a fuller, more complete view of her patient’s health, we have a greater possibility of lowering healthcare costs, improving health outcomes, and saving lives. Once we open up the exchange to patients through things like the Blue Button personal health record, the sky is the limit.
Editor’s Note: The following is an update to a previous EMR and HIPAA blog post titled “EMR Companies Holding Practice Data for “Ransom”.” In this update, James Summerlin (aka “JamesNT”) reports on EHR vendors’ willingness to let providers access their EHR data.
Over the years I have been approached with questions by several solo docs and medical groups about things such as the following:
And there have been plenty of times I’ve had to give unfavorable answers to those questions. In many cases, it involved an online EMR or PM where I could not get to the database, and the vendor either refused to export a copy or wanted thousands of dollars for the export. With on-premises PM and EMR systems, getting to the data was a matter of working my way around whatever database was being used and figuring out which table held which data. Although working with an on-premises PM or EMR may sound easier, it often isn’t. The on-premises guys have some tricks up their sleeves to keep you away from your data, such as password-protecting the database and, in some cases, flat out threatening legal action.
A few years back, I wrote a post on a forum about my thoughts on how once you entered your data into a PM or EMR, you may never get it back. You can see John Lynn’s blog post on that here.
My being critical of EMR and PM software vendors is nothing new. I’ve written several posts on forums and blogs, even articles in BC Advantage Magazine, about how hard it can be to deal with various EMR and PM systems. Much of the, at times, downright contemptuous attitudes many PM and EMR vendors have towards their own clients can be very harmful. Let’s consider three aspects:
In situations like those above, the best path to resolution is for the practice to obtain its own technical talent and build its own tools to extend the capabilities of the data contained in its various databases and repositories, such as those of the PM and EMR. Unfortunately, as I have reported before, most PM and EMR systems lock up the practice’s data such that it is unobtainable.
At long last, however, there appears to be a light at the end of the tunnel that doesn’t sound like a train. Some of the EMR systems that doctors use are beginning to realize that creating a turtle shell around a client’s data, in the long run, doesn’t do the client or the PM/EMR vendor any good. One such EMR I’ve been working with for a long time is Amazing Charts. Amazing Charts has found itself in a unique situation in that many of its clients are actually quite technical themselves or have no problem obtaining the technical talent they need to bend the different systems in their practices to their will. The idea of having three or four databases, each an island unto itself, is not acceptable to this adventurous lot. They want all this data pooled together so they can make real business decisions.
Amazing Charts, therefore, has decided to be more open regarding data access. Read-only access to the Amazing Charts database is soon to be considered a given by the company itself. Write access, of course, is another matter. Clients will have to prove, and rightly so, that they won’t go spelunking through the database making changes that do little more than rack up tech-support calls. Even with the caution placed on write access, this is a far cry from the flat-out “no” many other companies will give you when you ask for access to their database. I consider this a great leap forward for Amazing Charts and one that, I’m certain, will set them apart from competition that still considers lock-in and a stand-offish attitude the way to treat clients who pay them a lot of money.
Perhaps one day other PM and EMR vendors will see the light and realize the data belongs to the practice, not the vendor, and will stop taking people’s stuff only to rent access to it back to them or withhold it altogether. Until then, Amazing Charts seems to be leading the way.
I have posted previous notes about U.S. academic pathology departments and LIS vendors entering into various types of business arrangements with Chinese businesses. One involved the UPMC pathology department providing second opinions for surgical pathology cases to KingMed Diagnostics three years ago (see: UPMC Enters China Market for Second Opinions in Surgical Pathology Cases). Another involved the establishment of a global pathology network by PathCentral, a cloud-based AP-LIS, with participation by China-based Kindstar Globalgene Technology (see: PathCentral Debuts Agnostic Global Pathology Network). About three years ago, Mayo Clinic also signed a collaboration agreement with Kindstar (see: Mayo Clinic Signs Strategic Collaboration Agreement with Wuhan Kindstar Globalgene Technology, Inc. (Kindstar)). Here's a quote from that article:
...Kindstar offers 750 tests in specialties including hematology, oncology, genetics, infectious diseases, and cardiovascular disease. Through the collaboration with Mayo Clinic, Kindstar will expand its test menu offerings, promoting high-quality diagnostic and therapeutic care for the people of China by providing patients and physicians the broadest access to advanced esoteric testing services.
Now comes news that UCLA has signed a business agreement with Centre Testing to operate an esoteric lab in Shanghai with a special focus on molecular and genetic testing (see: UCLA signs agreement with Centre Testing to create clinical laboratory in Shanghai). Here is an excerpt from the article:
The University of California and UCLA Department of Pathology have signed an agreement with Centre Testing International Corp., a Chinese firm, to create a company that will operate a clinical laboratory in Shanghai. The new lab will support clinical trials and enhance medical care for Chinese patients with cancer and other diseases. The new company, CTI-Pathology/UCLA Health, is jointly owned by CTI and the University of California. The 25,000-square-foot facility - the first of its kind in China - will offer genetic and molecular diagnostics and other sophisticated tests that exceed the scope of the average lab in China, and UCLA pathologists will train Chinese lab specialists to accurately interpret the tests....The partnership is the first between a Chinese company and a U.S. academic medical center to create a specialized laboratory in China. ....UCLA will oversee management of the laboratory to ensure that its operations meet international standards for quality, and CTI will provide capital funding and marketing expertise.
This deal between UCLA and Centre strikes me as being unusual in a couple of ways. First, the new lab will focus on clinical trials as well as molecular and genetic testing. These are areas in which an academic lab like UCLA could have special expertise that would be of great value in the Chinese market. China is now one of the major sites for clinical trials, both because of the reduced cost of managing trials there and because the Chinese market for prescription drugs is becoming one of the largest in the world (see: Clinical Trials Increasingly Move Offshore, Many to China; China's pharmaceutical industry -- Poised for the giant leap -- pdf). A second point is that Centre Testing International (CTI) is a testing, inspection, certification, and consulting firm, not a hospital or clinical laboratory. CTI is providing capital and marketing expertise for the project, but UCLA will totally oversee management of the laboratory. This is a different scenario than an American academic facility providing expertise on a case-by-case or referral basis to a Chinese lab or hospital.
You might be an #HITNerd If…
you can’t write your middle name in cursive, but you can touch type.
NEW: Check out the #HITNerd store to purchase an #HITNerd t-shirt or cell phone case.
Note: Much like Jeff Foxworthy is a redneck, I’m well aware that I’m an #HITNerd.
Of the thousand daily frustrations I experience as a radiologist, perhaps the most painful is that of the "portable patient." You see, patients migrate from hospital to hospital, from clinic to clinic, and from office to office. They may be searching for a second opinion, a superspecialist, someone who will give them the particular answer they seek (some want to hear good news, some prefer bad news), convenience, drugs, or some combination of the above.

I was pretty smart back then, identifying a problem that many folks far wiser than I have been trying to solve since. And last year, I authored a follow-up article:
As often as not, they acquire a mountain of imaging studies along the way. When asked why they had a particular study at a particular site, the answer is invariably, "My doctor told me to have it there."
Add to that the dependence on our ERs for emergent (or maybe just impatient care, as I like to call it), and the ER's love of imaging studies. Put them together and you've got a collection of the patient's imaging studies spread across a city or even a state.
I've introduced you to a portable patient, and you can see what happened to her because no one knew about the examinations she had already undergone. She was irradiated, magnetized (probably less of a problem), and scared to death (arguably more damaging than radiation) because we have no way to connect the dots of her various studies.

Forgive the massive regurgitation of the last post, but you must acquire (or reacquire) the mindset of the necessity of image-sharing.
Well, that isn't quite true. We do have ways -- we just aren't using them... Many years ago, when our old PACS needed replacing, I suggested to the IT types that the three hospital systems in our average town in the South combine efforts to create a single citywide PACS to serve all three hospitals and, particularly, all of their patients. I was told by the illustrious chief information officer that we couldn't even think of working with one of the other hospitals because it was "suing us" (which wasn't quite a lie ... they were challenging a certificate of need application). Millions of dollars and patient welfare down the toilet over C-suite egos.
There were and are other approaches. As an alternative to a central repository, connecting one PACS to another isn't that hard. The best way to do this -- and fulfill all HIPAA requirements in the process -- is to use an image-sharing system such as lifeImage (my personal favorite by a mile).
Don't even bother to suggest that CD-ROMs solve anything. They don't. They get lost, they get broken, they don't always load, the patient forgets to bring the disk, or the original imaging site forgets to send it, and darn, they're closed today...
At one of the clinics we staff, the clinicians come at me at least twice a day, every day, with an outside CD. After three years, I finally was able to convince the powers that be to load the damn things into PACS and merge the data with local exams. But the clinicians don't want to bother with waiting for the disks to load -- they want results now. In my opinion, CDs aren't even worthy of being drink coasters, given that huge hole in the middle. (And their older PACS rejects a significant percentage of the disks anyway.)
Here's where I'm going to anger a lot of people, and this is of course why you like to read my rantings. The following is something that needs to be said, however, and I'm going to say it.
Given that ...
- Not knowing that the patient has had prior studies leads to unnecessary imaging
- Unnecessary imaging may expose the patient to unnecessary radiation, costs, and anxiety
- Unnecessary radiation is bad for you, as is anxiety
- We have ways to share prior studies

... then it stands to reason that today, in the 21st century, shirking our responsibilities to the patient in this aspect of medical imaging is malpractice. Yes, I used the "M" word. But that's exactly what it is. We are not doing what we should -- and what we must -- for patient care. It is high time to apply technology that has been around for a long time to unify patients' records, imaging and otherwise.
We are harming our patients out of ignorance, out of hubris (why would they go to any doctor/hospital/clinic other than me/mine?), and out of greed (I get the revenue if I repeat the study!). This is completely unacceptable...
(A)pproximately 90% of duplicate and potentially unnecessary CT scans were ordered by physicians who have little to no usage of the HIE when combining slices of users with less than 500 queries in 18 months. An opportunity therefore exists to reduce the number of duplicate CT scans if the physician is utilizing HEALTHeLINK to look up information and recent test results on their patients prior to ordering more tests. In addition, this also highlights a need to get more physicians participating and using the HIE in a meaningful way as more than 70% of duplicate CT scans were ordered by physicians who did not query HEALTHeLINK.

Another study from the University of Michigan found:
RESULTS: In our sample there were 20,139 repeat CTs (representing 14.7% of those cases with CT in the index visit), 13,060 repeat ultrasounds (20.7% of ultrasound cases), and 29,703 repeat chest x-rays (19.5% of x-ray cases). HIE was associated with reduced probability of repeat ED imaging in all 3 modalities: -8.7 percentage points for CT [95% confidence interval (CI): -14.7, -2.7], -9.1 percentage points for ultrasound (95% CI: -17.2, -1.1), and -13.0 percentage points for chest x-ray (95% CI: -18.3, -7.7), reflecting reductions of 44%-67% relative to sample means.

HIE was associated with reduced repeat imaging in EDs. This study is among the first to find empirical support for this anticipated benefit of HIE.

That's a lot of repeat studies. And a lot of excess radiation. We can wait for the study to be delivered from the outside place, or the outside CD to be loaded ("Film at Eleven"), or we can redo the study. None of these choices are optimal. We can all see that.
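The quoted 44%-67% range follows directly from dividing each percentage-point reduction by the corresponding baseline repeat rate. A quick sanity check using the numbers above:

```python
# Each modality: (baseline repeat rate in %, reduction in percentage points)
modalities = {
    "CT":          (14.7, 8.7),
    "ultrasound":  (20.7, 9.1),
    "chest x-ray": (19.5, 13.0),
}

for name, (baseline, reduction_pp) in modalities.items():
    relative = 100 * reduction_pp / baseline
    print(f"{name}: {relative:.0f}% relative reduction")
# CT: 59%, ultrasound: 44%, chest x-ray: 67%
```

The distinction matters when reading HIE studies: an 8.7 percentage-point drop sounds modest, but against a 14.7% baseline repeat rate it means well over half of repeat CTs were avoided.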
I spoke with a friend today who is now the sixth person to have heard rumors about Nuance entering the image sharing market. He thinks it will announce the acquisition of a small Atlanta-based company imminently. I know the target company rather well, think highly of the founders, and I’m happy to see them finally reap some benefit from their 15-year-old startup odyssey. They started out as a small PACS company and then carved out a niche by selling data center based teleradiology PACS, which I think delivers the great majority of its $6M or so annual sales.

This little company is apparently Accelarad. More on them in a moment. Back to Hamid:
We (lifeIMAGE) started out working with innovators and early adopters who believed in our cause. We believe in eliminating duplication of imaging, avoiding delays in care and excessive radiation, and improving quality of care for patients. To realize our goal, we build software that helps make medical images part of a patient’s record and helps physicians access imaging histories conveniently, from any setting. We’ll soon announce our fifth anniversary as a well funded, privately held company, with many remarkable results that make our team very proud...

The cure for the portable patient indeed.
..(I)mage sharing for serving radiology, with 25,000 or so US radiologists, where Nuance has its major presence, has been around for a long time. Innovations in teleradiology are well past their prime, so, we at lifeIMAGE do not see a disruptive opportunity to innovate in that area. We are focused on the far broader need, which exists among large health systems that need to avoid the cost and problems associated with repeat imaging orders. Their ordering physicians, our end-users, are non-radiology image intensive specialists who need access to patients’ imaging histories in order to reduce the rate of repeat exams.
Recently, I’ve been fascinated with what professor Everett Rogers called “the law of diffusion of innovation.” It basically spells out that there is a point at which an innovation reaches critical mass. “The categories of adopters are: innovators, early adopters, early majority, late majority, and laggards.” The early majority buy into a technology when it’s been well vetted by innovators and early adopters first. Every innovative and disruptive company looks for the sign that its technology has started to be adopted by the “early majority.” Nuance’s entrance into the image sharing market is an indication for me that the market is getting ready for broad adoption, validating what we already see in the lifeIMAGE customer statistics. Professor Rogers suggests that once 16% of the market has signed up for a technology, that’s when the early majority starts to adopt. Current lifeIMAGE customers represent nearly 16% of all US physicians...

To me, being rather more concrete than some, a "disruptive" technology is one that interrupts my workflow, and nothing could fit that definition better than what Nuance is really known for: Speech Recognition, also incorrectly known as Voice Recognition. Here we have a technology that displaces the human transcriptionist, freeing the hospital from the tyranny of employing said human and paying their salary and benefits. It dumps the work of transcribing and editing onto the radiologist with no increase in pay for the effort. And it barely works. A friend who is totally enamored with SR tried to show me how wonderful it functions in his enterprise. I watched him focus his entire attention onto the report screen, which was three monitors away from the radiographic image he was supposed to be interpreting. Yah, this is great and wonderful stuff. Now it does speed things along. My friend claims to be able to read 300 exams in 8 hours with <1% error-rate because of his beloved SR. I'll simply say that it wouldn't work that well in my hands.
lifeIMAGE is the most widely utilized image sharing network, designed for use by physicians across a wide range of clinical disciplines—neurology, orthopedics, cardiology, oncology, surgery, etc. Our position is unique in that our engine of innovation is fueled by this population of doctors, who encounter patients with outside imaging histories on a daily basis. We also help providers with patient engagement strategies and lead the way in providing access to patients, who can in turn share their imaging records with providers of their choice. So, indeed, new market forces may very well validate the market and expedite adoption of our disruptive and expansive technology, with innovation guided by multi-disciplinary specialists, including radiologists...
When I was CEO of AMICAS, our team spent some time studying the concepts around disruptive technology. Wikipedia defines it as follows: “A disruptive innovation is an innovation that helps create a new market and value network, and eventually disrupts an existing market and value network (over a few years or decades), displacing an earlier technology.” That is what our web-based PACS was back in 1999.
Our medical imaging solution combines the ease of social networking with the clinical precision and security that medicine demands, making medical image sharing with patients, colleagues and other organizations easier than ever. Accelarad allows you to quickly and securely upload, access, manage and share medical images from any Internet-connected computer or mobile device, or via our app. So you have images and reports from any originating institution, physician or system at your fingertips from a single portal, allowing you to focus on what you do best: delivering patient care.
I pushed hard for an "outside study" solution. We were regular victims of Philips PACS non-DICOM CDs every night from a particular hospital. We looked at both lifeIMAGE and Accelarad, and went with the latter; it works well for us. However, the Nuance purchase suggests to me that they want to become a complete 3rd party reading group, and replace groups like Optimal. Once they can share images well, dictate reports and disseminate results, they become a radiology department for anyone. I'll bet they start advertising over-reads/consults by big institution names before it's all over. Hey, just because you're paranoid doesn't mean they aren't out to get you...
It just looks to me like they are assembling the pieces of the puzzle to become "Uber Radiology". The video mentions/shows a graphic for telemedicine; that screams 3rd party. Any site can be set up to simply add their system as a destination on each modality. Boom, you send them your images, and they can be read. It's not even a "PACS to PACS transfer" but a replacement PACS. No onsite storage is needed, just the Nuance cloud... oops, until the internet is down and you don't have your images anywhere...
BURLINGTON, Mass. – April 7, 2014 – Nuance Communications, Inc. (NASDAQ: NUAN) today announced that it has named Trace Devanny as president of Nuance’s Healthcare business. Mr. Devanny will oversee Nuance’s largest division and lead its efforts to deliver a more seamless approach for healthcare professionals to document, share and use clinical information. He will report to Paul Ricci, Nuance chairman and CEO.
“Our healthcare business presents a significant opportunity for innovation, leadership and growth in today’s dynamic healthcare environment,” said Paul Ricci, chairman and CEO of Nuance. “As a healthcare technology industry veteran, Trace brings a powerful skillset that combines operational excellence, team development, customer engagement and a strategic vision. I look forward to working with him to lead Nuance and our healthcare business through its next phase of growth.”
Mr. Devanny has more than 30 years of executive leadership experience in the healthcare IT industry, having held senior roles at multi-billion dollar, international healthcare organizations. He joins Nuance from TriZetto Corporation, where he served as chairman and CEO. At TriZetto, he drove revenue and bookings growth in excess of 20 percent and led the organization through a business and sales model transition. Previously, he held several executive roles at Cerner Corporation over an eleven-year period, most recently as president, during which he was instrumental in growing the company from a $340 million business in 1999 to a $1.8 billion healthcare IT leader. Earlier in his career, Devanny was president and COO of ADAC Healthcare Information Systems and held a series of executive positions with IBM and its healthcare business. He holds a BA degree from the University of the South.
“Improving quality of care while driving down healthcare costs is one of the most significant challenges that providers face today. Nuance is advancing these initiatives through innovative solutions that make it easier for providers to deliver patient care,” said Trace Devanny. “I look forward to working with this talented and ambitious organization to build on our momentum and make an even greater impact on the healthcare system at this important point in its history.”