June 22, 2014

By James Ong, M.D., via kevinmd.com:

There was a time when doctors were doctors. There was a time when young men and women sacrificed the best years of their youth, learning to treat patients and conquer diseases, not to become typists, paper pushers, data-entry clerks, or to have hospitals, insurance companies, and the federal government dictate to us how to practice medicine.

There was a time when doctors were trusted. There was a time when we were not presumed guilty until proven otherwise by blood-thirsty bounty hunters like the wild dogs of the recovery audit contractor (RAC), unleashed by our ruthless federal government.

There was a time when doctors’ orders were meant to be orders. There was a time when our orders were not subject to endless scrutiny and nonsensical denials by HMOs, pharmacists, hospital formularies, insurance companies, and the federal government.

There was a time when doctors’ opinions were valued and their recommendations followed. There was a time when we were not routinely challenged by our patients, their families, their neighbors, hospital case managers, hospital administrators, medical officers of HMOs, or some random doctors on the other end of the phone 3,000 miles away.

There was a time when a doctor’s progress note held useful information. There was a time when our writing contained constructive and consequential discussion of patients’ medical conditions, not a mere memo attached to the chart and artificially beefed up with worthless numbers, reports, graphs, and other meaningless rubbish created in anticipation of a Medicare audit.

There was a time when a doctor’s consultation note was a work of literary art — succinctly crafted and beautifully articulated to efficiently describe the patients and their diseases, and to effectively convey to the reader the well thought-through recommendations of its author.

There was a time when the reader of a medical chart did not have to scroll through 10 pages of repetitious, auto-filled, and computer-generated garbage mandated by the feds, the hospitals, and Joint Commission (JCAHO), only to get to an anemic, malnourished, and anticlimactic — albeit the most important — final section of “discussion and recommendation.”

There was a time when doctors actually wrote orders. There was a time when we did not have to go through 10 steps and 22 keystrokes on a hospital computer just to place a simple order like NPO, which would have otherwise taken 2 seconds with a pen.

There was a time when doctors actually saw patients. There was a time when we spent more time with our patients than with our computers — more time on listening to them than on training Dragon Dictation, typing notes, keeping up with meaningful use, writing rebuttals to RAC, drafting appeals to PPOs, filling out forms, updating problem lists, and reconciling medications on hospital EHR.

There was a time when doctors were welcomed by the hospitals and the communities they served. There was a time when hospitals assigned more parking spaces to doctors than to their own administrators. There was a time when there were no padlocks on the refrigerator in the doctor’s lounge.

There was a time when doctors actually utilized their brains at work. There was a time when practicing medicine was not just about completing forms, checking boxes, navigating pathways, meeting core measures, and predicting — before patients actually arrive at the hospital — whether they would be inpatient or outpatient according to Medicare rules.

There was a time when doctors spent more time thinking about patient care than pushing papers. There was a time when we did not have to sign more documents for a simple outpatient procedure than what is required on a home mortgage application.

There was a time when HMO was still a three-letter word. There was a time when JCAHO was still a five-letter word. There was a time when Obamacare was not a four-letter word.

There was a time when private health insurance was considered good insurance. There was a time when the acronym PPO was not used for blasphemy.

There was a time when patients actually paid their bills. There was a time when the words co-pay, deductible, and coinsurance meant what they truly meant, not some random numbers subject to inventive negotiation and crafty blackmail tactics by some patients.

There was a time when doctors were judged by their credentials and by their professional peers. There was a time when our medical school diploma meant more than the reviews on certain online social media intended for rating restaurants, plumbers, and prostitutes.

There was a time when young men and women went to medical school because having “M.D.” behind our names was the most honorable and respectable thing to do. There was a time when professional gratification for physicians was not an oxymoron.

There was a time when doctors were doctors, physicians, and surgeons. There was a time when we were not referred to as contractors, providers, or whatever other denigrating and demoralizing monikers insurance and government bodies choose to confer on all of us who have dedicated our lives to this once prestigious and highly respected profession.

There was a time when doctors were real doctors.

James Ong is a cardiologist.

June 14, 2014


It has been a few years since I've made it to the Society of Nuclear Medicine annual meeting; so long, actually, that the Society has since changed its name and is now called the Society of Nuclear Medicine and Molecular Imaging. I personally think the change was prompted by the unfortunate amusement among the puerile (like yours truly) related to the original title's initials. SNMMI just doesn't have the same ring to it, but they don't ask me about these things.

The exhibit hall is about 1/50th the size of the massive trade show at the RSNA, but most of the big players have a presence.

Herr Großkopf, the huge animated head, graced the Siemens booth, touting Amyvid. My new Intevo SPECT/CT was perpetually scanning a rather eerie half of a patient, also created by the geniuses of Legacy Effects. (By the way, if you want to see more on the production of Herr Großkopf, try this YouTube video: http://youtu.be/L3gE1Zp9Vt8)

It is a humbling experience to sit in the educational sessions presented by those quite a bit smarter than I am, and realize just how much I don't know. But somehow, we average rads/NM physicians do manage to muddle through. We can't all be the best of the best, but I'm grateful to those who are the best for their willingness to share their expertise with the rest of us.

I'm often asked how Nuclear Medicine differs from Radiology. The simple answer is that with the latter, we somehow externally energize the body, or pass energy through it; an X-ray is taken by passing the beam of radiation through the body and detecting what is blocked and what is transmitted. With MRI we spin your molecules, and with Ultrasound, we bounce sound waves off your tissue interfaces. You get the idea. 

Nuclear Medicine is different. Here, we inject some radioactive stuff (in the case of PET, we use honest-to-God antimatter) into your body, or make you drink or eat it, breathe it in, or in some cases instill it into places God did not intend us to instill things. We then use very sensitive detectors to see where the stuff went, how long it takes to go away, and so on. In the proper doses, some of the agents can be used for therapy as well. The stuff we give you is colloquially called a radiopharmaceutical: basically a drug with a radioactive atom attached to it. In the case of radioiodine, the radioactive isotope of the element itself, delivered as sodium iodide, is given to scan, treat, or even ablate the thyroid, which gobbles up iodine, radioactive or not.

The bottle of wine above isn't really a radiopharmaceutical, but I found it amusing as the term MUGA refers to a Nuclear Medicine heart scan, generically known as a Multiple Gated Acquisition. There were dozens and dozens of new scanning agents being discussed at this meeting, and at least some of them will be approved eventually for general use. Hopefully. That process takes a long, long time, and I'm told new drugs require a $4 BILLION investment each. Given this, you would think that a successful product would be celebrated and propagated.  You would be wrong.

One of the big areas of research in diagnosis and therapy involves labelled monoclonal antibodies, basically a radioactive atom attached to a natural molecule that targets some particular tissue. These so-called magic bullets have been somewhat disappointing overall, not quite magic after all. But some have worked, and in particular, a couple of these compounds, Zevalin and Bexxar, directed at treatments of some forms of lymphoma have really performed quite well. The antibody seeks out the tumor (actually a particular protein on the tumor surface) and the radioactive atom blasts away. For once, the term "cure" can be tossed around for patients who had failed other regimens. Zevalin uses yttrium-90, while Bexxar uses iodine-131 as its nuclear bomb, so to speak. 

I've administered both of these therapies. They are horrendously expensive, with the price tag of the pharmaceutical alone exceeding $26,000. Medicare paid us perhaps $15-17k, so there is a potential loss on every dose which must be made up by charging those with other coverage upwards of $100,000. I've had more experience with Bexxar, and that experience has been good. We've achieved a number of complete remissions and we had no deaths that could be attributed directly to Bexxar. Our one Zevalin patient did not survive, but that is most likely due to the fact that this therapy wasn't applied until that particular patient was at death's door. Another shop in town did quite a few of these therapies, ending several years ago; some of the patients are now coming back with white cell depletion, a known complication when you are placing radioactive stuff right in the marrow. It's a problem, but one that they wouldn't have had without this therapy because they would most likely be dead.

One of the lectures this week at SNMMI touted the virtues of Bexxar, noting that as compared to Zevalin, its administration offers at least an attempt at radiation dosimetry. We give tiny doses to see where it goes and how fast it is eliminated, and base the final dose on that data. It's not complete, but it takes an army of physicists to do better, and out in the field, this is pretty advanced stuff.
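The dosimetry idea from that lecture (measure how fast the tracer clears, then size the therapy dose from that data) can be sketched in miniature. This is a toy illustration only, not clinical dosimetry software: the function and the numbers below are hypothetical, and it assumes simple mono-exponential whole-body clearance.

```python
import math

def effective_half_life(t1_hr, count1, t2_hr, count2):
    """Estimate the effective half-life (hours) from two whole-body
    count measurements, assuming mono-exponential clearance."""
    clearance_rate = math.log(count1 / count2) / (t2_hr - t1_hr)  # per hour
    return math.log(2) / clearance_rate

# Hypothetical tracer counts: 100% retained at 2 hours, 50% retained at 74 hours
print(round(effective_half_life(2.0, 100.0, 74.0, 50.0), 1))  # 72.0 hours
```

In the real protocol, the tracer's measured clearance went on to set an administered activity targeting a specified whole-body radiation dose; this sketch stops at the clearance estimate, which is the part a lone physician could do without that army of physicists.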

Sadly, Bexxar, a very good treatment, is being pulled off the market. Why? Because the $100,000 per dose radiopharmaceutical couldn't turn a profit.

Jamie Reno, a Bexxar patient himself, writing in The International Business Times, bemoans the loss:
The discontinuation of Bexxar is an extreme example of a lifesaving drug being eliminated due to its relatively low profitability. In most cases, when a pharmaceutical company concludes that a niche drug is not making enough money, the product is sold to another, smaller company which continues to make it available. But continued availability is left to the discretion of the company (or companies) that owns the rights to the drug, which is how pharmaceutical companies can withhold potentially lifesaving experimental drugs that have not yet been approved by the FDA -- often, due to concerns that a potential problem resulting from such use could jeopardize the drug's ultimate approval.
Luke Timmerman, on Xconomy.com, traces Bexxar's history:
Bexxar, developed in the late ‘90s by South San Francisco-based Coulter Pharmaceutical and acquired in 2000 by Seattle-based Corixa, had a lot going for it. The drug was aimed at a protein marker called CD20, which was already a validated molecular target for cancer, based on the success a couple years earlier of a so-called “naked” antibody from Genentech and Idec Pharmaceuticals, rituximab (Rituxan). Corixa had a well-respected CEO in Steve Gillis who attracted scientific talent, and raised lots of cash. It had a Big Pharma partner in Glaxo to help it manufacture and market the drug to the fullest.


Corixa, unable to turn Bexxar into a profit center, ended up being acquired by GlaxoSmithKline in 2005.
Without boring you with reams of positive data, suffice it to say that Bexxar worked, and worked well. There was talk of using it as a secondary or even a primary therapeutic agent, rather than a last-ditch hail-Mary for those who had failed all other regimens. But it was not to be. Timmerman continues:
But there was a catch. Oncologists who saw these non-Hodgkin’s lymphoma patients could prescribe rituximab at an infusion center, along with chemotherapy. These doctors made money on every patient that went through their infusion center. Prescribing Bexxar meant they’d have to forgo that revenue stream, and refer the patient to a nuclear medicine pharmacy or radiation oncologist who could handle Bexxar or Zevalin.

“There were complicated logistics with having oncologists refer to another part of the healthcare system they normally didn’t interact with,” Rivera says. “We couldn’t get them to change their habits. The doctor would usually say ‘Oh, I’ll give the patient another course of R-CHOP’ (Rituxan plus a specific chemo regimen) instead.”

Younes, the chair of lymphoma at Memorial Sloan-Kettering, has heard the story about oncologists rejecting Bexxar because they didn’t want to refer patients to medical centers that might be seen as competitors. He says that point is “exaggerated” and notes that oncologists refer patients to other specialists all the time. He points to other problems with Bexxar’s commercialization. “It’s almost a comedy of errors,” he says.
Are we to believe that the very people who were supposed to be saving patients walked away from a cure over profit? I'm trying hard not to... There were other factors, though:
There was a muddled clinical trial strategy, Younes said. Multiple trials were opened up to expand Bexxar usage, which may have been well-intended, but the plan ended up confusing physicians about where the drug was most useful, Younes said. A lot of clinical trials were sponsored, making it possible for many patients who might have paid to get the drug to instead get it for free. Then at one point, Glaxo abruptly shut down all the trials, Younes said.

“They ended up pissing off a lot of people,” he said.

There were headaches in manufacturing an antibody that was linked to radiation. The radioactive piece of the drug came from a supplier in Canada, and the occasional snowstorm would throw the whole supply chain out of whack, causing patients’ infusions to be delayed, Rivera said. That was a big inconvenience for some patients who sometimes had to drive hours for a scheduled infusion at a big academic medical center, Rivera said.
So what constitutes bad sales? Reno gives us the numbers:
While Bexxar saved this writer’s life in a clinical trial in 1999 with virtually no side effects and has saved many other lives, sales of the drug did not meet GSK’s expectation. Catalina Loveman, GSK’s director of U.S. external communications, oncology, told IBTimes that total sales of Bexxar in 2012 in the U.S. and Canada were approximately $1 million; for comparison, the blockbuster drug Viagra earned Pfizer a reported $2.05 billion in sales in 2012.
Everything is relative.  I guess it didn't occur to GSK that those who survive lymphoma might eventually become Viagra customers. Oh, well. I guess a few thousand lives pale in comparison to a few million...well, need I say it?

Zevalin will remain on the market because its owner has a different outlook:
Like Bexxar, Zevalin has also struggled in the marketplace. In the third quarter 2013, Zevalin’s profits were $8 million. But unlike GSK, Spectrum Pharmaceuticals, makers of Zevalin, is committed to keeping this drug on the market.

“What is happening with Bexxar is virtually unprecedented,” said Spectrum’s chief operating officer, Ken Keller, who came to Spectrum a year and a half ago from California-based Amgen, the world's largest independent biotech company. “I do not know of a single example of a drug company that has walked away from a drug that is this effective. Typically, when a company gives up on a treatment that works this well, they will a find a smaller company to sell it.”

Keller acknowledged that neither Bexxar nor Zevalin has been able to break through and become the blockbuster drugs that he says they both should be.

“I’ll be honest: We don’t gain a lot of value from Zevalin,” he said. “We have the data that shows how well it works, but it has still not caught on with many doctors. However, Spectrum will continue to manufacture Zevalin because our CEO [Raj Shrotriya] is on a mission to make RIT the standard of care for lymphoma in the U.S. If this were only about finances, it could lead to a different decision. But this treatment saves lives, and we believe we have an obligation to cancer patients. They deserve to have access to it.”
Why was Bexxar quashed? Only GSK really knows for sure. Maybe their Board of Directors all own stock in Spectrum. Information from a friend of a friend of a friend of a cousin of a sister of a guy who knows something indicates that the move was deliberate and based purely on profit or the lack thereof. Maybe someone looked into the zillion pages of ObamaCare as currently morphed and decided GSK would NEVER see any profit, let alone a break-even point, on Bexxar.

It occurs to me that this whole episode (put very nicely) represents a complete disservice to our patients. We have in Bexxar a VERY good treatment, a lifesaving treatment, and it was scuttled because oncologists wouldn't use it, and they wouldn't use it because they couldn't make money on it, so neither could GSK. Does anyone else find this sickening?

But I'm prompted to think outside the box.

I've become mildly addicted to Kickstarter and Indiegogo, crowd-funding sites that promote anything from researching burritos to sending people on a one-way trip to Mars. (I've personally helped fund a couple of smart-watches that haven't come to fruition and probably won't, and a few other frivolous items.)

I don't know IF GSK would consider selling the rights to Bexxar, and if they would, I don't know how much they might want for something they buried in their corporate backyard. If the price was reasonable, I would buy the rights myself and find a way to get Bexxar back on the market. Assuming the price tag is a bit above my weekly allowance, the next step might be a Kickstarter campaign for interested parties to pick up the tab. This would include not only people like me who want Bexxar to be available, in my case as a physician, but also patients and their families as well. It would, of course, be critical to attract folks who have connections in the pharmaceutical industry who could actually put Bexxar back into production.

To my knowledge, this has not happened before. We've all heard of orphan drugs, but I can't think of any/many that were actually completely suppressed in this manner. Nor have I ever heard of a consortium of the type I propose above rescuing a valid treatment from oblivion. But this needs to be done, and I challenge you to join me to do it.

GSK? Docs? Patients? Family members? Are you listening?

May 4, 2014

Dear Readers:  The following story is fictional. Any resemblance to anyone real, living, dead, or otherwise, is purely in the warped mind of the beholder.

Once upon a time, a phrase that amuses physicists no end, there was a King and Queen who lived in a modest but nicely-furnished castle in the Land of Iodine. They were what one might call enlightened. They ruled their little nation with as much kindness and wisdom as they could muster, and their subjects prospered.

The royal couple had a beautiful little daughter, Princess Xela, upon whom they doted and fussed, and due to their ministrations, or perhaps in spite of them, she grew to be a beautiful and wise woman, beloved by everyone she met. Xela's love of animals was known throughout the kingdom, and it was clear to all that she would one day become a world renowned veterinarian. And so it was.

Xela attended the finest veterinary schools in the land, and studied very, very hard. After all, there is much knowledge to be acquired to care for the multitude of God's creatures. She learnt it all, or tried to, anyway, and was celebrated by her mentors as one of the finest students they had seen in years. Of course, the very proud King and Queen made it possible for her to study in relative luxury. Her little hut on the grounds of the school wasn't particularly posh, but it was hers. This did make Xela somewhat uncomfortable, as many of her classmates were living in hovels, and selling their hair and other unmentionable body-parts to the enchantresses at neighboring schools of magic, just to eat. Princess Xela told no one of her heritage, so as not to make them jealous. If asked how she could live so well and keep her hair, she would mumble something about being lucky at rolling the bones, and change the subject.

Princess Xela of course did quite well in all her subjects. But it was during the courses of herpetology that she first encountered the Asp. The legless being saw in Xela, well, something. We of warm-blood and calm disposition probably will never understand just what it is the predator sees in the prey. The Asp wanted Xela for his own nefarious purposes. (Such has it been since our first female ancestor Eve, who was also the object of desire of a serpent, one who offered her knowledge in return for the fall of Mankind from Paradise.) By instinct, Xela shunned the Asp, for he was scaly, weird-looking, and poisonous, toxic to mankind. But the Asp, being cold-hearted but incredibly cunning, continued his gentle but relentless pursuit. He brought offerings, sacrifices he managed to catch in his fangs, and spoke endlessly of his knowledge of all things herpetic, and even pertaining to the treatment of all animals, all the while telling Princess Xela how wonderful she smelt, and how much he would love to be her companion. And gradually, ever-so-slowly, she began to think that perhaps the Asp wasn't so bad after all. Ultimately, she took him home with her, and established a place in her bed for him, so she could warm him at night; otherwise the poor Asp might freeze to death, being cold-blooded and all.

This rather unusual situation went on for quite a while, months, years. The King and Queen were of course beside themselves. Princess Xela had let them play with the Asp once or twice, not telling them just what species he was, but the monarchs were not stupid, and they rather quickly realized just what they were dealing with. Xela had grown to love the Asp, and that's all that mattered to her. She wanted to take him everywhere with her. She was taken aback when her friends and the other townspeople would shrink away in horror and fright when she drew the Asp from her pack, and she soon realized the wisdom of leaving him at home when she went to market or out to visit. But still, she longed for the day when all would accept her love for the Asp, and accept him for the wonderful being only she could see.

Xela had a brief flash of insight toward the end of her schooling. Her toxicology class had a long unit about the reptilian poisons, and she realized, ever-so-briefly, that the Asp was dangerous to her. She ran home to the palace, and, sobbing, regaled the King and Queen for hours about how close she had come to tragedy, having on multiple occasions stroked the Asp on his head while he bared his fangs in delight. She swore to put him back in his cage and never pick him up again.

But alas, Princess Xela had a kind heart, and even after her oath, she happened to pass by the Herpetorie one day, and was again drawn to the call of the Asp. And she succumbed. She tried to hide her recidivism from the King and Queen, but they happened to see the Asp peek his head out of her pack one day, and the situation was clear.

Things went downhill from there. The Princess was a modern girl, not afraid in the least of the King and Queen, and staunchly declared that she loved her "Aspie" as she now called the Asp, that she would never return him to his cage, certainly not on the orders of her old fuddy-duddy parents. She declared that she, the preeminent veterinarian-to-be, was wrong about the Asp after all, that he was really a completely different, non-poisonous species. The King and Queen were, of course, devastated, seeing imminent danger for their beloved daughter, but unable to convince her of this. Ultimately, the King even threatened to throw himself on his sword, which prompted Xela to embrace the Asp even tighter. Of course, the Asp understood none of this with his brilliant but limited reptilian mind, but he did realize that Xela was drawn to him by some strange bond, and that's all he really cared about.

The King and Queen went about the business of running the kingdom, but their broken hearts were no longer in the task, and the courtiers and regents could certainly tell that something was wrong, although they did not know just what it could be. Kings don't have the luxury, generally, of taking a prolonged period to mourn such things. All the while, the King was dealing with several crises in the kingdom. There was a failure of magic throughout the land on one occasion, wherein things that were etched in stone suddenly were not after all, and vice-versa as well. There was a drought which affected the bremsstrahlung crop, and the wealthier among the members of the court were terrified that they could no longer maintain their castles properly, not to mention their Albion-crafted chariots.

I wish I could tell you that, as in other fairy tales, all lived happily ever after. Sadly, I cannot. All I can say is that the King and Queen are doing their best to plod along through the remainder of their lives, expecting daily the messenger bearing the news that the Asp has finally done what Asps do, and poisoned their beloved Princess Xela.

The End

July 23, 2014


In various conversations on how to improve patient care, the importance of health literacy is often raised. Health literacy underpins effective patient engagement and healthy habits. Information and knowledge create greater awareness of how to live healthier and interact with doctors in a more meaningful way.

Another element of health literacy needs to include health IT literacy. With about 78% of care providers now using electronic health records (EHR) and wearable technology gaining momentum, healthcare is moving into the digital age. Patients will not need to go deep into the technology, but a base understanding will be required.

Although this is not a complete list, we need to begin somewhere. Highlighted below are some basic health IT elements to raise the literacy levels of patients.

Key Health Laws

Affordable Care Act: This law generates intense feelings and debate. The Medicaid.gov site defines the Affordable Care Act in this way:

“…provides Americans with better health security by putting in place comprehensive health insurance reforms that will:

Expand coverage,

Hold insurance companies accountable,

Lower health care costs,

Guarantee more choice, and

Enhance the quality of care for all Americans.”

Essentially, the Affordable Care Act expands Medicaid coverage to low-income individuals and works toward adding improvements to our healthcare system.

HITECH / Meaningful Use: In health IT circles, most will know what Meaningful Use is and where it came from. Move outside this circle and most will just think the drive to electronic health record adoption is a part of the Affordable Care Act (Obamacare). Meaningful Use was born out of the American Recovery and Reinvestment Act of 2009 (aka Stimulus bill), in which the Health Information Technology for Economic and Clinical Health (HITECH) Act was buried. Meaningful Use is a part of HITECH and, together, they seek:

“…to improve American health care delivery and patient care through an unprecedented investment in health information technology. The provisions of the HITECH Act are specifically designed to work together to provide the necessary assistance and technical support to providers, enable coordination and alignment within and among states, establish connectivity to the public health community in case of emergencies, and assure the workforce is properly trained and equipped to be meaningful users of EHRs.”

Simply stated, HITECH/Meaningful Use is an incentive program to move patient records from paper to an electronic format, which will then enable secure, efficient exchange of patient data, and provide patients easier access to their records.

Key Applications

EHR – Electronic Health Record: According to the HealthIT.gov website:

“An electronic health record (EHR) is a digital version of a patient’s paper chart. EHRs are real-time, patient-centered records that make information available instantly and securely to authorized users.”

An important element to an EHR is it contains all relevant patient information from different clinicians involved in a patient’s care.

PHR – Personal Health Record: According to American Health Information Management Association (AHIMA),

“The PHR is a tool that you can use to collect, track and share past and current information about your health or the health of someone in your care. Sometimes this information can save you the money and inconvenience of repeating routine medical tests. Even when routine procedures do need to be repeated, your PHR can give medical care providers more insight into your personal health story.”

Patients own and manage their health data – you own it, you maintain it. Having the ability to electronically receive relevant data from care providers in a usable, efficient way is very helpful.

Key Privacy and Security Elements

HIPAA – Health Insurance Portability and Accountability Act: Finding a concise definition for HIPAA is challenging. HHS.gov offers a good explanation:

“Most of us believe that our medical and other health information is private and should be protected, and we want to know who has this information. The Privacy Rule, a Federal law, gives you rights over your health information and sets rules and limits on who can look at and receive your health information. The Privacy Rule applies to all forms of individuals’ protected health information, whether electronic, written, or oral. The Security Rule is a Federal law that requires security for health information in electronic form.”

Your rights include saying who can see your data from clinical visits, and providers are responsible for securing your data collected during these visits.

PHI – Protected Health Information: Since protected health information was used in the HIPAA definition, we should address it. The National Institutes of Health highlights PHI as “individually identifiable health information that is transmitted or maintained in any form or medium (electronic, oral, or paper) by a covered entity or its business associates, excluding certain educational and employment records.”

Essentially, PHI is your health data.

Key Health Actions

Quantified Self: There is much more health data available because there are more tracking devices to use. Quantified Self and wearable tech are used interchangeably; both mean that you are proactively tracking (quantifying) your health metrics. Watches, mobile phones, apps, and other devices make the recording of your daily health information easy.

By tracking your health status, the objective is to understand your healthy habits and their impact as well as keep chronic conditions monitored and stable.

With better and timelier data, your health patterns are recognized and can be adjusted more effectively, as needed. Think diet, exercise, blood sugar, heart rate, and much more… recorded, tracked, and shared as you define.
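As a small illustration of how tracked numbers become a visible pattern, a moving average over your readings smooths out day-to-day noise so that a trend stands out. The readings below are invented for illustration:

```python
def moving_average(readings, window=7):
    """Average each trailing `window` of readings to smooth daily noise."""
    smoothed = []
    for i in range(len(readings)):
        chunk = readings[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# Hypothetical resting heart rates over two weeks
rates = [72, 74, 71, 73, 75, 70, 72, 69, 68, 70, 67, 69, 66, 68]
print(round(moving_average(rates)[-1], 1))  # 68.1, the average of the last 7 readings
```

The raw numbers bounce around; the smoothed series makes the gentle downward trend in the second week visible, which is exactly the kind of pattern a tracking app surfaces for you and your doctor.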

Interoperability: Inevitably in health IT conversations, the challenge (or outright lack) of sharing patient data between providers, applications, and devices will arise. Healthcare has many data standards (e.g., HL7, X12) and different communication protocols (e.g., TCP/IP, Direct Project, Web Services).

For data to flow, each application vendor needs to open up their application or device to send and receive data. After this, the data differences need to be understood and then mapped. Integration solutions exist to orchestrate this patient data flow, but the considerations are many: application perimeters, privacy and security requirements, data specifications, workflow necessities, and more.
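The mapping step can be sketched in a few lines of Python. The field names below are invented for illustration; real mappings are dictated by the standards involved (e.g., HL7 segments or X12 loops) and each vendor's data specifications:

```python
# Hypothetical sketch of mapping one application's field names onto
# another's. These field names are invented; real integrations map
# between standard structures such as HL7 segments or X12 loops.

SOURCE_TO_TARGET = {
    "pt_last_name": "family_name",
    "pt_first_name": "given_name",
    "dob": "birth_date",
}

def map_record(source_record):
    """Translate a source record's field names into the target schema."""
    return {SOURCE_TO_TARGET[field]: value
            for field, value in source_record.items()
            if field in SOURCE_TO_TARGET}

record = {"pt_last_name": "Doe", "pt_first_name": "Jane", "dob": "1980-01-01"}
print(map_record(record))
# {'family_name': 'Doe', 'given_name': 'Jane', 'birth_date': '1980-01-01'}
```

An integration engine does essentially this at scale, layering on queuing, transformation rules, privacy controls, and error handling.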

Interoperability is achievable and, as a patient, requesting your data in an electronic, secure way will help facilitate this requirement.

What Does Health IT Literacy Look Like?

When health IT literacy works, it looks like a more fully engaged patient. The flow of health IT literacy may look like the illustration below. Pieces of the healthcare puzzle begin to fit together, and patients gain a broader perspective of the whole, along with their important role within the healthy flow.

The Flow of Health IT Literacy

Raising Health IT Literacy

Healthcare has many components and, ultimately, the most essential element is delivering high-quality care in a timely and efficient manner. In the middle of this is you – the patient. Understanding what is healthy is core to health literacy. Understanding how your data is collected, stored, used, and exchanged is central to health IT literacy. We need to raise our standards for both healthcare literacy and health IT literacy, and this will take a community and your active participation.

What other key elements are required to raise health IT literacy? Add your thoughts and let’s expand this list to what is important for patients to grasp and use.

Categories: News and Views, All

July 22, 2014


I’m a big fan of the online world. I love the ease of online banking, the efficiency of Zappos shoe shopping, and the simplicity of reading The Drudge Report for all the latest news. Someday I may also be a huge enthusiast for online patient portals, but that’s not quite the case today.

During the workday I rarely think about mundane tasks such as scheduling physicals or calling the eye doctor to order new contacts. I am more likely to recall that my daughter needs a follow-up appointment with the ENT when I notice her taking off her hearing aid for the night. Or, I’ll remember it’s time for a mammogram while sharing a bottle of wine with girlfriends and someone mentions the joys of her most recent scan. That last one happens a lot, actually.

I’d like to think I am the quintessential candidate for online patient portals: busy single mom who works full-time and is tech-savvy. I have little patience for being placed on hold for 10 minutes while listening to an endlessly looping recording about the importance of my call. I’d much rather schedule a doctor’s appointment with a few clicks on my keyboard while sipping my first cup of coffee. I get annoyed when my only communication option is to wait until the office opens at 9:00 a.m., navigate the automated phone system, listen to on-hold messages, and finally exchange forced pleasantries with a multi-tasking receptionist.

Recently I had a very positive experience using my primary care physician’s patient portal. One of my specialists requested a copy of my PCP’s referral form in order to schedule a new appointment. I accessed the PCP’s patient portal and in about two minutes found the referral and requested a copy to be forwarded to the specialist. The next day the specialist’s office called to say they had the referral in hand.

Other recent patient portal attempts have been a bit less successful. Typically if I need to schedule any type of medical appointment, I first go to the practice’s website and determine if they have an online scheduling option. That’s what I did a couple of months ago to schedule an appointment for my daughter, and the whole process worked beautifully: the system asked for my preferred days and times; the next day I had an email informing me to check the practice’s portal for a message; the message informed me of the appointment time, which I then confirmed.

Unfortunately, a couple of days later my daughter reminded me of a conflict. So, back to the portal I went to send a new message requesting a reschedule. After several days I realized no one had responded to my message. I sent a second message. Again, no response. I ended up having to call the office, navigate the automated phone system, listen to on-hold messages, and finally exchange forced pleasantries with a multi-tasking receptionist.

Another one of my physicians uses a patient portal but its functionality is limited. For example, I am able to request an appointment with preferred dates and times, but rather than having an automated response, someone calls me back to finalize the appointment time. It beats having to call the office and being placed on hold, but if I miss the call or am driving, it’s back to the old-fashioned telephone method.

I often hear providers complain about the Stage 2 Meaningful Use requirement that at least five percent of patients view or download their personal health information via an online portal. Many argue the threshold is too high because many patients lack Internet access or computer expertise, or simply prefer communicating with a live person. However, I’d contend that providers are not doing themselves any favors by implementing poorly designed portals with limited functionality. As a patient, I wonder why I should use a portal if it doesn’t eliminate having to call the practice. I worry that my messages are getting “lost” – either due to technical glitches or office workflow issues. I get frustrated with confusing navigation and functionality that can’t hold a candle to what my veterinarian offers.

In a world where we can spend 10 minutes online and pay a month’s worth of bills, buy a pair of shoes, and read the day’s headlines, why is the healthcare industry so far behind in its efforts to provide patients with a consistently efficient online experience?

Categories: News and Views, All

July 17, 2014


Any dancer or doctor knows full well what an incredibly expressive device your body is. 300 joints! 600 muscles! Hundreds of degrees of freedom!

The next time you make breakfast, pay attention to the exquisitely intricate choreography of opening cupboards and pouring the milk — notice how your limbs move in space, how effortlessly you use your weight and balance. The only reason your mind doesn’t explode every morning from the sheer awesomeness of your balletic achievement is that everyone else in the world can do this as well.

With an entire body at your command, do you seriously think the future of interaction should be a single finger? – Bret Victor

The Future will be Virtual, Augmented, and Wearable

The USC Institute for Creative Technologies is a pioneer in Virtual Human (VH) technology. ICT’s work with virtual humans creates digital characters that look, sound, and behave like real people.

Understanding the human face is an especially complex process. The face contains 43 muscles, and it takes five muscles to display whether we are happy, sad, afraid, angry, disgusted, or surprised. But understanding and sensing emotions in real humans is key to making virtual characters more realistic.

VH technology is currently being used to help clinicians better interact with patients.

MultiSense and SimSensei in Healthcare 2014

ICT developed MultiSense as a way to quantify facial expressions, body posture, and speech patterns. Algorithms combine this data to create a complete picture of a user’s emotional state in real time, and with a profile that recognizes changes over time. MultiSense drives SimSensei – a next generation Virtual Human platform designed to improve healthcare decision-making and delivery.


Learn more about MultiSense and SimSensei in healthcare.

Patients More Honest with Virtual Humans

Virtual Human technology is used in role-playing and training to help clinicians improve their interactions with patients. But new research by ICT has netted some surprising results.

New research finds patients are more likely to respond honestly to personal questions when talking to a Virtual Human.

Originally, ICT began training clinicians by having them interact with a Virtual Human patient. In the new research, the tables were turned – patients interacted with a Virtual Human interviewer asking questions a physician might normally ask. The process started with general getting-to-know-you questions, gradually leading to more personal and revealing ones like, “How close are you to your family?”

“Half of the participants were told that their conversation was entirely computer-driven and not being observed. The others were informed they were being watched by a person in another room who was also manipulating the machine to ask certain questions. In all cases, video images of their faces were recorded and later analyzed to gauge their level of emotional expression.” – Tom Jacobs, “I’ll Never Admit that to My Doctor”

Surprisingly, Virtual Humans were able to extract better patient data. In discussing private matters with the computer-generated entities, patients disclosed more information. Why? According to Gale Lucas, who led the study for ICT, participants did not feel like they were being observed or judged. They also reported “significantly lower fear of self-disclosure.”

You can read more about the study in the journal Computers in Human Behavior.

Virtual Humans Will Help Predict Treatment

Across the pond, researchers in England are using Virtual Physiological Humans “to engineer a simulation of the body so true to life, any data could be potentially input to create a personalized health plan, and predictions for any future patient.”

According to Marco Viceconti, Director of the Insigneo Institute at the University of Sheffield,

“If I now feed to my simulations the data related to a particular individual, that simulation will make health predictions about the status of that individual. This is not personalized medicine, this is individualized medicine, we can finally say something about you not because you are about the same age and sex and disease as another thousand people, but because you are you with your condition and your history.”

Virtual Human Insight for Wearable Technologies

In a recent opinion piece for CIO, Brian Eastwood writes that wearable tech’s dilemma is too much data and not enough insight. He explains that even though he runs marathons and writes about healthcare IT, he still does not have a fitness tracker.

I started thinking about how Virtual Human technology could combine with wearable devices. Although speech recognition technology is already used with Google Glass, it is not at the level of sophistication of VH. Imagine your own Virtual Human personal trainer who would have an understanding of your emotions and behaviors, and your personal weaknesses and motivators. Interacting with your VH through speech-recognition technology would minimize the need to display lots of data on a small screen. Your VH-enabled wearable device could know just the right words and cues to promote healthy behaviors, and maximize your personal wellness.

There will be no distinction, post-Singularity, between human and machine and between physical and virtual reality. – Ray Kurzweil



Categories: News and Views, All

January 6, 2014

GNUmed now supports the following workflow:

- patient calls in asking for documentation on his back pain

- staff activates patient

- staff adds from the document archive to the patient
  export area a few documents clearly related to episodes
  of back pain

- staff writes inbox message to provider assigned to patient

- provider logs in, activates patient from inbox message

- provider adds a few more documents into the export area

- provider screenshots part of the EMR into the export area

- provider includes a few files from disk into export area

- provider creates a letter from a template and
  stores the PDF in the export area

- provider notifies staff via inbox that documents
  are ready for mailing to patient

- staff activates patient from inbox message

- staff burns export area onto CD or DVD and
  mails to patient

- staff clears export area

Burning media requires both a mastering application
(like k3b) and an appropriate script, gm-burn_doc
(like the attached), to be installed. Burning the
directory passed to the burn script onto some media
produces an ISO image like the attached one.

GPG key ID E4071346 @ gpg-keyserver.de
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346

November 26, 2013

Here it is

0.) Do a full backup. Save it on some other media than your hard disk! Do it.

1.) Install PG 9.3 ( I tried with 32bit but should not matter).
- http://get.enterprisedb.com/postgresql/postgresql-9.3.1-1-windows.exe

2.) Run the installer and select (English_UnitedStates) for locale (others
might work as well). Make sure it installs itself on port 5433 (or another
port, but never 5432).

3.) Make sure both PG 8.4 and PG 9.3 are running (e.g. check via pgadmin3)

4.) open a command shell (dos box) - "run as" administrator (!) in Win7

5.) type : RUNAS /USER:postgres "CMD.EXE"
- this will open another black box (command shell) for user postgres
- for the password use 'postgrespassword' (default)

6.) type: SET PATH=%PATH%;C:\Programme\PostgreSQL\9.3\bin;
- instead of Programme it might be Program Files on your computer

7.) type: cd c:\windows\temp
- changes directory to a writable temporary directory

8.) type: pg_dump -p 5432 -Fc -f gnumedv18.backup gnumed_v18

9.) type: pg_dumpall -p 5432 --globals-only > globals.sql

Important: Protect your PG 8.4 by shutting it down temporarily

10.) type in the first command shell: net stop postgresql-8.4
- check that it says: successfully stopped

11.) psql -p 5433 -f globals.sql
- this will restore roles in the new database (PG 9.3 on port 5433)

12.) pg_restore -p 5433 --dbname postgres --create gnumedv18.backup
- this will restore the database v18 into the PG 9.3 on port 5433

Congratulations. You are done. Now to check some things.

Here you could run the fingerprint script on both databases to check for an
identical hash



13.) Open gnumed.conf in c:\programme\gnumed-client\
For the profile GNUmed database on this machine ("TCP/IP": Windows/Linux/Mac),
change port=5432 to 5433.

14.) Run the GNUmed client and check that it is working. If it works (no wrong
schema hash detected) you should see all your patients and data.

15.) If you have managed to see your patients and everything is there, close
GNUmed client 1.3.x.

16.) in the first command shell type: net stop postgresql-9.3

17.) Go to c:\Programme\PostgresPlus\8.4SS\data and open postgresql.conf. Find
port = 5432 and change it to port = 5433.

18.) Go to c:\Programme\Postgresql\9.3\data and open postgresql.conf. Find port =
5433 and change it to 5432. This effectively switches ports for PG 8.4 and 9.3
so PG 9.3 runs on the default port 5432.

19.) Open gnumed.conf in c:\programme\gnumed-client\
For the profile GNUmed database on this machine ("TCP/IP": Windows/Linux/Mac),
change port=5433 to 5432.

20.) Restart PG 9.3 with: net start postgresql-9.3.

21.) Open the GNUmed client and connect (to PG 9.3 on port 5432).

22.) Leave PG 8.4 in a shutdown state.

So far we have transferred database v18 from PG 8.4 to 9.3. No data from PG
8.4 is touched/lost.

23.) Now you are free to install gnumed-server v19 and gnumed-client 1.4.
Having installed gnumed-server v19, select 'database upgrade' (not 'bootstrap
database') and it will upgrade your v18 database to a v19 database.

In case you experience problems you can always shut down PG 9.3, switch ports again, install client 1.3.x, start PG 8.4 (net start postgresql-8.4) and work with your old setup.

November 13, 2013

The release notes prominently tell us that GNUmed 1.4.x requires at least PostgreSQL 9.1.

If you are running the Windows packages and have let GNUmed install PostgreSQL for you, you are good to go, since it comes with PostgreSQL 9.2 already.

If you are on Ubuntu or Debian, chances are your system still has PostgreSQL 8.x installed.

First check if you run any software that requires you to continue using PostgreSQL 8.x. If so, you can install PG 9.1 side by side with it. If not, let PG 9.1 replace PG 8.x.

It usually works like this.

sudo apt-get install postgresql-9.1
sudo pg_upgradecluster 8.4 main

Then if you don't need PG 8.4 anymore you could

sudo pg_dropcluster --stop 8.4 main
sudo apt-get purge postgresql-8.4

Have fun.

March 6, 2013


Healthcare executives are continuously evaluating RFID and RTLS in general, whether to maintain the hospital’s competitive advantage, achieve differentiation in the market, improve compliance with requirements from AORN, JCAHO, and the CDC, or improve asset utilization and operating efficiency. These evaluations carry a constant concern: a tangible and measurable ROI for solutions that can come at a significant price.

When considering the areas that RTLS can affect within the hospital facilities as well as other patient care units, there are at least four significant points to highlight:

Disease surveillance: Hospitals face many challenges around disease management. RTLS technology can determine each and every staff member who could have potentially been in contact with a patient classified as highly contagious or with a specific condition.

Hand hygiene compliance: Many health systems are reporting hand hygiene compliance as part of safety and quality initiatives. Some use “look-out” staff to walk the halls and record all hand hygiene activities. With the introduction of RTLS, however, hand hygiene compliance can now be dynamically tracked and reported whenever clinical staff enter a room or use the dispensers. Several of the systems available today also provide active alerts to clinicians whenever they enter a patient’s room and haven’t complied with the hand hygiene guidelines.

Locating equipment for maintenance and cleaning:

Having the ability to identify the location of equipment that is due for routine maintenance or cleaning is critical to ensuring the safety of patients. RTLS is capable of providing alerts on equipment to staff.

In one recent case, a hospital spent two months on a benchmarking analysis and found that it took 22 minutes on average to find an infusion pump. After the implementation of RTLS, it took an average of two minutes. This cuts down on lag time in care and can help ensure that clinicians have the tools and equipment they need, when the patient needs them.

There are also other technologies and products which have been introduced and integrated into some of the current RTLS systems available.

EHR integration:

Several RTLS systems are integrated with bed management systems as well as EHR products and are able to deliver patient order status; alerts can also be given within the application. This has enabled nurses to stay on one screen and see a summary of updated patient-related information.

Unified Communication systems:

Nurse call systems have enabled nurses to communicate efficiently anywhere the device is implemented within the hospital facility. These functionalities are starting to infiltrate the RTLS market, and for some of the Unified Communication firms it means that their infrastructure can now provide a backbone for system integrators to integrate their functionality within their products.

In many of the recent implementations of RTLS products, hospital executives opted to pilot the solutions within one specific area. Many of these smaller implementations succeed and allow the decision makers to evaluate and measure the impact these solutions can have on their environment. Several steps need to be taken into consideration when implementing asset tracking systems:

• Define the overall goals and driving forces behind the initiative

• Identify the challenges and opportunities the RTLS solution will address

• Identify the operational area that would yield the highest impact with RTLS

• Identify infrastructure requirements and technology of choice (WiFi based, RFID based, UC integration, interface capability requirements)

• Define overall organizational risks associated with these solutions

• Identify compliance requirements around standards of use


RFID is one facet of sensory data being considered by many health executives. It is providing strong ROI for many of the adopters applying it to improve care and increase efficiency of equipment usage, as well as equipment maintenance and workflow improvement. While there are several different hardware options to choose from, and technologies ranging from Wi-Fi to IR/RF, this technology has been showing real value and savings that health care IT and supply chain executives alike can’t ignore.

February 21, 2013


It was not long after mankind invented the wheel that carts came around. Throughout history people have been mounting wheels on boxes; now we have everything from golf carts, shopping carts, and hand carts to my personal favorite, the hotdog cart. So you might ask yourself, “What is so smart about a medical cart?”

Today’s medical carts have evolved to be more than just a storage box with wheels. Rubbermaid Medical Solutions, one of the largest manufacturers of medical carts, has created a cart that is specially designed to house computers, telemedicine equipment, and medical supplies, and to offer medication dispensing. Currently the computers on the medical carts are used to provide access to CPOE, eMAR, and EHR applications.

With the technology trend of mobility quickly on the rise in healthcare, organizations might question the future viability of medical carts. However, a recent HIMSS study showed that cart use at the point of care rose from 26 percent in 2008 to 45 percent in 2011. The need for medical carts will continue to grow; as a result, cart manufacturers are looking for innovative ways to separate themselves from their competition. Medical carts are evolving from healthcare products to healthcare solutions. Instead of selling medical carts with web cameras, cart manufacturers are developing complete telemedicine solutions that offer remote appointments throughout the country, allowing specialists to broaden their availability to patients in need. Carts are even interfaced with eMAR systems that can increase patient safety; the evolution of the cart is rapidly changing the daily functions of the medical field.

Some of the capabilities of future medical carts will include automatically detecting their location within a healthcare facility. For example, if a cart is improperly stored in a hallway for an extended period of time, staff could be notified to relocate it in order to comply with the Joint Commission’s requirements. Real-time location information could also allow carts to automatically process tedious tasks commonly performed by healthcare staff. When a cart is rolled into a patient room, it could automatically open the patient’s electronic chart or give a patient visit summary through signals exchanged between the entering cart and a logging device kept in the room.

Autonomous robots are now starting to be used in larger hospitals such as the TUG developed by Aethon. These robots increase efficiency and optimize staff time by allowing staff to focus on more mission critical items. Medical carts in the near future will become smart robotic devices able to automatically relocate themselves to where they are needed. This could be used for scheduled telemedicine visits, the next patient in the rounding queue or for automated medication dispensing to patients.

Innovation in medical carts will continue as the need for mobile workspaces increases. What was once considered a computer on a stick could be the groundwork for care automation in the future.

September 10, 2012


This has been an eventful year for speech recognition companies. We are seeing increased development of intelligent systems that can interact via voice. Siri was simply a re-introduction of digital assistants into the consumer market, and since then other mobile platforms have implemented similar capabilities.

In hospitals and physicians’ practices, the use of voice recognition products tends to center on traditional speech-to-text dictation for SOAP (subjective, objective, assessment, plan) notes and some basic voice commands to interact with EHR systems. Several new initiatives will involve speech recognition, and natural language understanding and decision support tools are becoming the focus of many technology firms. These changes will begin a new era for speech engine companies in the health care market.

While there is clearly tremendous value in using voice solutions to assist during the capture of medical information, there are several other uses that health care organizations can benefit from. Consider a recent product by Nuance called “NINA”, short for Nuance Interactive Natural Assistant. This product consists of speech recognition technologies that are combined with voice biometrics and natural language processing (NLP) that helps the system understand the intent of its users and deliver what is being asked of them.

This app can provide a new way to access health care services without the complexity that comes with cumbersome phone trees, and website mazes. From a patient’s perspective, the use of these virtual assistants means improved patient satisfaction, as well as quick and easy access to important information.

Two areas we can see immediate value in are:

Customer service: Simpler is always better, and with NINA-powered apps or Siri-like products, patients can easily find what they are looking for, whether they are calling a payer to see if a procedure is covered under their plan or contacting the hospital for information about the closest pediatric urgent care. These tools will provide a quick way to get access to the right information without having to navigate complex menus.

Accounting and PHR interaction: To see the potential of these solutions, consider some of the use cases Nuance has been exhibiting. From a health care perspective, patients would have the ability to simply ask to schedule a visit without having to call. A patient could also request a medication refill by voice.

Nuance did address some of the security concerns by providing tools such as VocalPassword to tackle authentication. This helps verify the identity of patients who are requesting services and giving commands. As more intelligent voice-driven systems mature, the areas to focus on will be operational costs, customer satisfaction, and data capture.


July 23, 2014


Melissa McCormack, a medical researcher with EHR consultancy group Software Advice, recently published their medical practice management BuyerView research, which found that 63% of the buyers were replacing existing PM solutions, rather than making a first-time purchase.  This mirrors the trend we’ve seen across medical software purchasing, where the HITECH Act may have prompted hasty first purchases of EHR solutions, followed by replacements 1-2 years later. For PM vendors, this means there’s a huge opportunity to market your products to practices as an upgrade, even if they’re already using PM software. I reached out to Melissa to ask her to elaborate on the implications of the trends she found in her recent research. Here’s some advice for vendors and solutions providers.

1. As EHR meaningful use requirements grow more involved, standalone billing or scheduling systems are becoming less viable. In fact, nearly 70 percent of the buyers we spoke with wanted integration between practice management and EHR. The trend of PM buyers looking for robust EHR integration grows more pronounced each year, and shows no signs of tapering off since EHR meaningful use requirements increasingly require physicians to utilize charting, billing and scheduling in tandem. Vendors who can offer seamless integration between these applications will have a clear advantage over those who cannot.

2. Another regulatory pressure influencing PM software replacement is ICD-10. Compliance with the new code set is a major driver not only of practice management purchases in general, but specifically of replacements—25% of buyers replacing an existing solution cite a concern that their current solution wouldn’t support the code set switch. Despite the implementation deadline having been extended to October 2015, we’re seeing practices give a lot of thought to preparation, and they’re realizing the software they use will play a major role in their own readiness. Vendors who are confident in their ICD-10 readiness should take care to communicate that confidence to their existing users, as well as marketing it to prospective customers.

3. The medical practice management software buyers we talk to clearly prefer cloud-based systems. Among buyers with a preference, 88% want cloud deployment. We’re hearing from smaller practices that they value the low up-front costs, as well as not needing to maintain servers and dedicated IT staff. Additionally, buyers appreciate the remote access options afforded by cloud solutions. Some buyers even seem to conflate “cloud” with “remote access” and “mobile access” (even though those features aren’t unique to cloud-based products), suggesting these are the features of cloud-based software they are most concerned with. In fact, almost 20% of buyers identified mobile access as a top priority. Vendors who offer mobile support are at an advantage and should highlight their capabilities prominently.

4.  Practice management software buyers come from diverse roles within practices. We saw clinicians and administrative staff represented almost equally—46% and 40%, respectively—among our buyer sample. Vendors should consider their audiences when marketing their products and tailor communication accordingly, giving equal weight to the unique benefits for clinicians and administrators.

July 6, 2014


I recently saw a demo of the Decisions.com platform and left impressed with the workflow engine, business rules execution, forms automation, and data integration platform. I’m very familiar with almost all the major HL7 routers and integration engines out there, but Carl Hewitt, Founder and Chief Architect at Decisions, is releasing something fairly unique — a visual HL7 interface definition and integration platform for use by analysts and non-technical personnel charged with healthcare data connectivity across business workflows. I found their approach unique enough that I’ll do something I don’t do often — a review. But before I post the review in the coming days, I reached out to Carl to help set the stage and share the most common questions and answers we get about HL7.

What is HL7?

HL7 is how healthcare applications talk to each other – for example, when a patient is admitted to a facility, when a patient schedules an appointment, when a lab test is ordered, or when a medication is prescribed, an HL7 message can be sent from one system to another. HL7 is what disparate systems use to tell each other about patient activity. It is a widely adopted, text-based communications standard created in 1987 that can run on almost all modern hardware and software systems.

The HL7 specification is governed by Health Level Seven International, a not-for-profit, ANSI-accredited standards developing organization.

How do HL7 systems communicate?

HL7 is human-readable text that is broken up into meaningful sections and sent as data packets from application to application across well-defined communication mechanisms with handshaking and acknowledgement procedures.  Because the data is sent over widely adopted communication mechanisms in a readable format (i.e., text that can be opened on any computer), HL7 tends to work pretty well.

What are the various components of HL7 messages?

Like any technology, HL7 uses a glossary of specific terms that have specific meaning.  While an interface engine alleviates the need to directly integrate with all of these concepts, understanding them will help you know what the HL7 engine is actually doing.

  • Envelope: Each communication from one system to another is contained in a block of data.  This block of data (Envelope) could contain one or more messages.
  • Message: A message is a set of data that is logically related (for instance, about a specific patient encounter or hospital admission).  A message is bounded by its type (Message Type) and contains a number of data sections (Segments).
  • Message Type:  A message type is the definition of the structure of a communication and provides rules for what data sections (Segments) are required, which are optional, and the order of the data.  The order of the data in a message is important as one section (Segment) can have different meanings depending on where it is in the message.  Messages are usually referred to by their message type and event type, for example: an ADT^A01 is an ADT message with an event type of A01 which means that this is an Admission, Discharge, or Transfer type message (ADT) and is signaling the event that a patient was Admitted (A01).
  • Event: Events allow a message type to have slightly different meaning as described above.  Events are related to the message type and an important part of the standard.  The same message structure can be used to signal different types of events.  The same basic data is needed to admit and discharge a patient with only small differences.

HL7 has evolved over many years and new events have been added to the standard that weren’t there in previous versions.  For instance, an ADT A01 (ADT Patient Admit) and an ADT A08 (ADT Patient Data Update) were initially defined as different message types, but later combined.  So, a message that comes in as an ADT A08 in version 2.5 of HL7 will actually have the structure of an ADT A01, however, because it is sent as an A08 – the meaning of the message will be an update.  Confusing?  No worries, most interface engines hide this fact and make you think you are still getting an A08.

  • Segment: A segment is a section of a message that contains the actual values that are being sent between systems.  Generally, a segment can be thought of as a line in a message and it always starts with an identifier that indicates what the segment type is.  An example of a segment is “PID” (patient identifier) where information about a person or patient is sent, or “MSH” (message header) which is described below.  Segment identifiers are always 3 characters.
  • Message Header, MSH Segment:  Every message starts with an MSH segment that declares information about the message including its version, type, which text delimiters are used, etc.  This special segment is used to provide information as to how to interpret the entire message.
  • Optional, or Z Segments:  HL7 is a flexible standard and allows for users to send data between systems that the specification does not accommodate.  This is done by using a Z segment, which is a custom segment with an identifier that starts with Z.  Most people use the Z and then follow it with characters that are part of the specification.  For example, to send additional data about a patient identity someone might use ZID, a derivative of PID.
  • Data Type:  Every segment is composed of data elements.  Each of these data elements has a “type.”  Types may be number types like integer and decimal or dates, or more complex types defined by the standard.  These data types are positional in the ‘segment line’ and delimited by the defined character (usually a pipe, |) for the message.  A data type might have one or more pieces to it (itself being delimited by another delimiter).  For instance, a ‘Code’ data type has an ID and Description, but it is one element inside the segment.
  • Delimiters:  While the characters that can break up a message are by convention pretty standard, they can be defined in the MSH segment.  Each message segment is always on a new line, and each segment is subdivided into data by a pipe character, ‘|’, and each data type is subdivided into its elements by a caret, ‘^’.  See example message below for more information.
  • ACK/NAK:  Acknowledge or NOT acknowledge the receipt of a message.  Most HL7 systems can be configured to send back receipts of communication.  This allows systems to keep a careful audit of data sent between them.  Sending systems can generally be configured to resend or at least report messages that get no receipt or get a NACK (not acknowledge message) returned.
  • Interfaces:  An interface can mean slightly different things between software and hardware vendors, but essentially an interface is usually a connection between two systems.  When an HL7 data feed is configured from the Emergency Department in a hospital to the Electronic Medical Record system this is usually referred to as an interface.

 Some interfaces are one way – they either send data to another system or listen to data from another system.  Most interface engines or interface technologies support sending and receiving data as these technologies normally sit in between two medical systems and modify the messages as they are sent.  Different interface engines might have functionality to transform or route messages attached to interfaces.
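The ACK/NAK receipts described above are simple enough to sketch in code. The following is an illustrative sketch only, not production code: it assumes the standard HL7 v2 field positions (MSH-10 holds the message control ID), swaps the sender and receiver fields, and simply echoes the inbound control ID back; a real engine would also generate its own control ID and timestamp. The sample message header is hypothetical.

```python
# Illustrative sketch: building an HL7 ACK for an inbound message.
# Assumes standard v2 field positions (MSH-10 = message control ID).
def build_ack(inbound: str, code: str = "AA") -> str:
    msh = inbound.splitlines()[0].split("|")
    control_id = msh[9]                      # MSH-10: control ID to echo back
    ack_msh = "|".join([
        "MSH", "^~\\&",
        msh[4], msh[5],                      # swap sending and receiving app/facility
        msh[2], msh[3],
        "", "", "ACK", control_id, "P", msh[11],
    ])
    return ack_msh + "\r" + f"MSA|{code}|{control_id}"

# Hypothetical inbound message header:
inbound = "MSH|^~\\&|SendApp|SendFac|RecvApp|RecvFac|20060529||ADT^A01|MSG001|P|2.5"
print(build_ack(inbound).splitlines()[1])    # MSA|AA|MSG001
```

Passing a code of "AE" or "AR" instead of "AA" produces the NAK case, which a sending system would treat as a signal to resend or report the message.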

How are HL7 text messages transmitted?

There are two primary technologies used to send messages in most healthcare applications: TCP and files.

  • Direct Connection (TCP):  Data can be transmitted over a network connection using TCP (also referred to as LLP or MLLP).  This is often referred to as a ‘direct’ connection or a real time connection between applications.
  • File Connection:  A simpler way of transmitting messages is by sending the data in a file or multiple files.  This can involve loading it into an application or configuring a directory to watch for new files to appear.  A file connection can also take the form of an FTP or SFTP server.
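For the direct TCP case, messages are conventionally framed with MLLP (the Minimal Lower Layer Protocol): each message is wrapped in a vertical-tab start byte and a file-separator plus carriage-return trailer before being written to the socket. A minimal sketch of the framing, with error handling mostly omitted:

```python
# MLLP framing bytes: <VT> message <FS><CR>
VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"

def mllp_wrap(message: bytes) -> bytes:
    """Frame an HL7 message for transmission over a TCP socket."""
    return VT + message + FS + CR

def mllp_unwrap(frame: bytes) -> bytes:
    """Strip MLLP framing from a received block."""
    if not (frame.startswith(VT) and frame.endswith(FS + CR)):
        raise ValueError("not a valid MLLP frame")
    return frame[len(VT):-len(FS + CR)]

frame = mllp_wrap(b"MSH|^~\\&|...")
assert mllp_unwrap(frame) == b"MSH|^~\\&|..."
```

The framing is what lets a receiver find message boundaries in a continuous TCP stream, since TCP itself has no notion of where one message ends and the next begins.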

What does an HL7 message look like?

HL7 messages are made up of segments, each carrying specific information about anything from a patient’s name to an allergy, a radiology image, or a transcript.

MSH|^~\&|MegaReg|XYZHospC|SuperOE|XYZImgCtr|20060529090131-0500||ADT^A01^ADT_A01|01052901|P|2.5
EVN||200605290901||||200605290900
PID|||56782445^^^UAReg^PI||KLEINSAMPLE^BARRY^Q^JR||19620910|M||2028-9^^HL70005^RA99113^^XYZ|260 GOODWIN CREST DRIVE^^BIRMINGHAM^AL^35209^^M~NICKELL’S PICKLES^10000 W 100TH AVE^BIRMINGHAM^AL^35200^^O|||||||0105I30001^^^99DEF^AN
PV1||I|W^389^1^UABH^^^^3||||12345^MORGAN^REX^J^^^MD^0010^UAMC^L||67890^GRAINGER^LUCY^X^^^MD^0010^UAMC^L|MED|||||A0||13579^POTTER^SHERMAN^T^^^MD^0010^UAMC^L|||||||||||||||||||||||||||200605290900
OBX|1|NM|^Body Height||1.80|m^Meter^ISO+|||||F
OBX|2|NM|^Body Weight||79|kg^Kilogram^ISO+|||||F
AL1|1||^ASPIRIN

In the message above, note the following segments:

  1. MSH, the message header segment, identifies the message.
  2. ADT^A01 shows us that we have a patient being admitted (A01) and the message structure is an ADT message.
  3. EVN is the event data, with key dates such as when the message was sent.
  4. PID is the patient identifier.  This has medical record numbers, names, and contact information.
  5. PV1 is the patient visit data, which details this specific visit to a facility.
  6. OBX is an observation, like vitals taken by a nurse.
  7. AL1 is the description of patient allergies.
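Given the delimiter rules described earlier, pulling a field out of a message takes only a few lines of string handling. The following is a minimal sketch (no HL7 library; field positions follow the v2 conventions), using the PID and OBX lines from the sample above:

```python
def parse_hl7(message: str) -> list[list[str]]:
    """Split an HL7 v2 message into segments, each a list of pipe-delimited fields."""
    return [line.split("|") for line in message.strip().splitlines() if line]

msg = "\n".join([
    "PID|||56782445^^^UAReg^PI||KLEINSAMPLE^BARRY^Q^JR||19620910|M",
    "OBX|1|NM|^Body Height||1.80|m^Meter^ISO+|||||F",
])
segments = parse_hl7(msg)

pid = next(s for s in segments if s[0] == "PID")
family, given = pid[5].split("^")[:2]        # PID-5: patient name, caret-delimited
print(family, given)                          # KLEINSAMPLE BARRY

obx = next(s for s in segments if s[0] == "OBX")
print(obx[3].lstrip("^"), obx[5])             # Body Height 1.80
```

A real interface engine does much more (delimiter discovery from MSH, repetitions, escapes, versions), but the underlying model really is this simple: lines of pipe-delimited fields with caret-delimited components.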

What type of data does HL7 transmit?

The HL7 specification is fairly comprehensive.  It contains data about many aspects of health care, including patients, schedules, appointments, interactions between providers and patients, insurance information, information on diagnoses and procedures, medical records, and much more.  If an application is configured to receive all messages of all types from another application, it is likely that much of the data received is not relevant to what is needed.  For instance, if I have a scheduling application, it might not be relevant for me to get information on updates of patient allergies, but changes to patient demographic information are very important.
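In practice, that kind of filtering is usually keyed off MSH-9, the message type field. A hypothetical sketch of how a scheduling application might screen an inbound feed (the set of wanted types here is illustrative, as are the sample message headers):

```python
# MSH-9 holds the message type (e.g. "ADT^A01"); a receiving application
# can inspect it and discard message types it does not need.
WANTED = {"ADT", "SIU"}   # illustrative: demographics and scheduling messages

def message_type(message: str) -> str:
    msh = message.splitlines()[0].split("|")
    return msh[8].split("^")[0]              # first component of MSH-9

def relevant(message: str) -> bool:
    return message_type(message) in WANTED

adt = "MSH|^~\\&|Reg|Hosp|Sched|Hosp|20060529||ADT^A08|MSG1|P|2.5"
oru = "MSH|^~\\&|Lab|Hosp|Sched|Hosp|20060529||ORU^R01|MSG2|P|2.5"
print(relevant(adt), relevant(oru))          # True False
```

Most interface engines expose exactly this as a routing rule, so the filtering happens before the message ever reaches the application.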

What are the challenges with HL7 in the healthcare enterprise?

A growing challenge with contemporary Healthcare IT Solutions is the “app-centric” approach many vendors are taking to solving problems. With more and more of these enterprise apps being designed as standalone systems, Healthcare IT teams are faced with unique integration challenges involving sensitive patient health information.

Many teams are trying to figure out how to implement a data layer that can bring all of the healthcare provider systems (billing, lab, patient, etc.) and partner systems together so each has access to the data that it needs. Some are taking home grown approaches with custom message services and open source technologies. Others are discovering a new breed of data management tools. Let’s take a closer look at what some of the primary tools have been and what some of the new tools look like.

There are a number of unique challenges to handling a standard-driven data structure, such as:

  • Handling multiple versions to and from multiple sources
  • Complicated nesting
  • Non-conformity
  • Communication infrastructure
  • Dirty data
  • Cost of integration



Zach Watson over at TechnologyAdvice.com wrote a nice piece on EHR trends in Nashville. I’m not a big fan of “trends” articles, because trends themselves aren’t that important; the implications of those trends, and how to operationalize them, are what matter most. I enjoyed Zach’s article, so I asked him to tell us what those trends mean for EHR buyers and health IT vendors writ large. Here’s what Zach said:

Our study surveyed office-based physicians across the city of Nashville to gain insight into which EHR systems they were using, as well as how pleased they were with those systems. It revealed these insights:

  • Adoption rate among certain specialties is significantly higher than available national averages
  • Satisfaction rate among Nashville doctors is higher than benchmarks established by previous studies
  • 16 percent of respondents have already switched EHRs
  • Cost appears to be the priority with Nashville buyers, with Epic not cracking the market top 5

We chose to survey Nashville because of the city’s vibrant technology market, which includes an eclectic healthcare IT industry subset. Spotlighted in the Wall Street Journal, Nashville’s healthcare market features the billion-dollar Hospital Corporation of America, and has played host to over a billion dollars of investment capital in the last decade.

For buyers of electronic medical records, several key points can be taken from the results of the study.

For specialists, a mix of specialty-specific products and highly customizable options has led to higher satisfaction than previously recorded averages. For example, 75 percent of dentists in Nashville use a best-of-breed system, such as Dentrix or Patterson Eaglesoft. The satisfaction of dentistry EHR users was 8.5/10, significantly higher than previously tallied national averages. Not only do best-of-breed EHRs still have a place in the market, it seems that providers who choose such platforms realize high satisfaction rates.

However, it’s not quite as simple as just choosing a best-of-breed platform. Many other specialties, from radiology to pulmonary disease to ENT and podiatry, reported using eClinicalWorks, the market leader in Nashville. These providers were mostly satisfied with their selection as well: eClinicalWorks averaged an 8.5/10 satisfaction rating. That leaves one to wonder what the answer truly is: a more general platform, or a best-of-breed solution?

Examining the eClinicalWorks platform can provide some insight. Specialty EHRs are defined by their alignment with the workflow of a particular type of physician, particularly in the charting feature. Basically, good templates can result in satisfied users. Though eClinicalWorks serves a broad market, it has a particularly robust customization feature. Branded eCliniSense, this function stores information about past diagnoses, such as labs and diagnostic imaging orders, which can then be used to construct order sets (in this context synonymous with templates) based on usage data. This feature can make creating customized templates much easier, which can allow providers to speed up their work rather than struggling to populate the same fields over and over again.

To be clear, eClinicalWorks is not the only software to feature highly customizable templates, but given the diversity of specialists that use it in the Nashville market, it’s clear that if specialists don’t go with a best-of-breed platform, they should seek a platform with customizable modules and templates.

Essentially, they should find a system that lets them create their own best-of-breed solution.

Another surprising finding was the absence of Epic in Nashville’s office-based physician population, as well as Practice Fusion’s prominence (it has the second-highest market share). These results seem to suggest that price is a large factor in EHR purchases in the Nashville market.

Epic is renowned for its high prices, so much so that it makes for good headlines. Practice Fusion is free, and is increasing its market share on a national level at a faster pace than any other vendor (this is also supported by its strong showing in the Nashville market). Of Practice Fusion users in Nashville, 20 percent were on their second EHR, again indicating that the price of this cloud-based vendor may have been an incentive (especially if they had lost money on their last EHR investment).

Providers are often admonished not to let price rule their EHR buying process, but the Commonwealth Fund’s recent study found that small and single-physician practices lag behind other EHR populations in terms of adoption. What’s slowing them down? Price.

Despite Meaningful Use incentive money, EHRs are still not cheap, and federal subsidies don’t cover the productivity loss or drop in quality scores that sometimes accompany the shift from paper to digital records. EHRs like Practice Fusion and Kareo are offering free models that these small practices can afford. Depending on which source you trust – the National Center for Health Statistics’ 78 percent or SK&A’s 61 percent – the number of providers adopting EHRs has reached a tipping point, and the laggards are balking at the price.

March 12, 2010

This blog is now located at http://blog.rodspace.co.uk/. You will be automatically redirected in 30 seconds, or you may click here. For feed subscribers, please update your feed subscriptions to http://blog.rodspace.co.uk/feeds/posts/default.

March 3, 2010

I've just heard about the Information Technology and Communications in Health (ITCH) conference, which will be held February 24 - 27, 2011, at the Inn at Laurel Point, Victoria, BC, Canada. I'd not heard of this conference before, but the current call for papers looks interesting. Health Informatics: International Perspectives is the working theme for the 2011 international conference.
The report of the Prime Minister’s Commission on the Future of Nursing and Midwifery in England, published yesterday, sets out the way forward for the future of the professions. It calls for the establishment of a "high-level group to determine how to build nursing and midwifery capacity to understand and influence the development and use of new technologies. It must consider how pre- and …"

June 9, 2013


“Large collections of electronic patient records have long provided abundant, but under-explored information on the real-world use of medicines. But when used properly these records can provide longitudinal observational data which is perfect for data mining,” Duan said. “Although such records are maintained for patient administration, they could provide a broad range of clinical information for data analysis. A growing interest has been drug safety.”

In this paper, the researchers proposed two novel algorithms, a likelihood ratio model and a Bayesian network model, for adverse drug effect discovery. Although the performance of each of these algorithms is comparable to the state-of-the-art algorithm, the Bayesian confidence propagation neural network, the researchers say that combining all three yields better, more diverse results.

via www.njit.edu

I saw this a few weeks ago, and while I haven't had the time to delve deep into the details of this particular advance, it did at least give me more reason for hope with respect to the big picture of which it is a part.

It brought to mind the controversy over Vioxx starting a dozen or so years ago, documented in a 2004 article in the Cleveland Clinic Journal of Medicine. Vioxx, released in 1999, was a godsend to patients suffering from rheumatoid arthritic pain, but a longitudinal study published in 2000 unexpectedly showed a higher incidence of myocardial infarctions among Vioxx users compared with the former standard-of-care drug, naproxen. Merck, the patent holder, responded that the difference was due to a "protective effect" it attributed to naproxen rather than a causative adverse effect of Vioxx.

One of the sources of empirical evidence that eventually discredited Merck's defense of Vioxx's safety was a pioneering data mining epidemiological study conducted by Graham et al. using the live electronic medical records of 1.4 million Kaiser Permanente of California patients. Their findings were presented first in a poster in 2004 and then in the Lancet in 2005. Two or three other contemporaneous epidemiological studies of smaller non-overlapping populations showed similar results. A rigorous 18-month prospective study of Vioxx's efficacy in preventing the recurrence of colon polyps showed an "unanticipated" significant increase in heart attacks among study participants.

Merck's withdrawal of Vioxx was an early victory for Big Data, though it did not win the battle alone. What the controversy did do was demonstrate the power of data mining in live electronic medical records. Graham and his colleagues were able to retrospectively construct what was effectively a clinical trial based on over 2 million patient-years of data. The fact that EMR records are not as rigorously accurate as clinical trial data capture was rendered moot by the huge volume of data analyzed.

Today, the value of Big Data in epidemiology is unquestioned, and the current focus is on developing better analytics and in parallel addressing concerns about patient privacy. The HITECH Act and Obamacare are increasing the rate of electronic biomedical data capture, and improving the utility of such data by requiring the adoption of standardized data structures and controlled vocabularies.

We are witnessing the dawning of an era, and hopefully the start of the transformation of our broken healthcare system into a learning organization.


Source: FutureHIT

June 7, 2013


I believe if we reduce the time between intention and action, it causes a major change in what you can do, period. When you actually get it down to two seconds, it’s a different way of thinking, and that’s powerful. And so I believe, and this is what a lot of people believe in academia right now, that these on-body devices are really the next revolution in computing.

via www.technologyreview.com

I am convinced that wearable devices, in particular heads-up devices of which Google Glass is an example, will be playing a major role in medical practice in the not-too-distant future. The above quote from Thad Starner describes the leverage point such devices will exploit: the gap that now exists between deciding to make use of a device and being able to carry out the intended action.

Right now it takes me between 15 and 30 seconds to get my iPhone out and do something useful with it. Even in its current primitive form, Google Glass can do at least some of the most common tasks for which I get out my iPhone in under five seconds, such as taking a snapshot or doing a Web search.

Closing the gap between intention and action will open up potential computing modalities that do not currently exist, entirely novel use case scenarios that are difficult even to envision before a critical mass of early adopter experience is achieved.

The Technology Review interview from which I extracted the quote raises some of the potential issues wearable tech needs to address, but the value proposition driving adoption will soon be truly compelling.

I'm adding some drill-down links below.

Source: FutureHIT

Practices tended to use few formal mechanisms, such as formal care teams and designated care or case managers, but there was considerable evidence of use of informal team-based care and care coordination nonetheless. It appears that many of these practices achieved the spirit, if not the letter, of the law in terms of key dimensions of PCMH.

via www.annfammed.org

One bit of good news about the Patient Centered Medical Home (PCMH) model: here is a study showing that in spite of considerable challenges to PCMH implementation, the transformations it embodies can be and are being implemented even in small primary care practices serving disadvantaged populations.

Source: FutureHIT

July 9, 2014

Dear ERCIM News Reader,

ERCIM News No. 98 has just been published at:

Special Theme: "Smart Cities"

featuring a keynote by Eberhard van der Laan, Mayor of Amsterdam

Guest editors:
- Ioannis Askoxylakis, ICS-FORTH, Greece
- Theo Tryfonas, Faculty of Engineering, University of Bristol, UK

This issue is also available for download as:
pdf:  http://ercim-news.ercim.eu/images/stories/EN98/EN98-web.pdf
epub: http://ercim-news.ercim.eu/images/stories/EN98/EN98.epub

Next issue: No. 99, October 2014 - Special Theme: "Quality Software"
(see Call at http://ercim-news.ercim.eu/call)

Thank you for your interest in ERCIM News.
Feel free to forward this message to others who might be interested.

Best regards,
Peter Kunz
ERCIM News central editor

Urban Civics - Democratizing Urban Data for Healthy Smart Cities
CityLab@Inria - A Lab on Smart Cities fostering Environmental and Social Sustainability
‘U-Sense’, A Cooperative Sensing System for Monitoring Air Quality in Urban Areas 
ERCIM News is published quarterly by ERCIM, the European Research Consortium for Informatics and Mathematics.
The printed edition will reach about 6000 readers.
This email alert reaches over 7500 subscribers.
ERCIM - the European Research Consortium for Informatics and Mathematics - aims to foster collaborative work within the European research community and to increase co-operation with European industry. Leading European research institutes are members of ERCIM. ERCIM is the European host of W3C.

Follow us on twitter http://twitter.com/#!/ercim_news
and join the open ERCIM LinkedIn Group

July 5, 2014


In learning we often look in turn for role models and exemplars, and then for comparators or examples against which to compare and contrast, in order to understand the context and our own knowledge, skills and potential.

In health and social care information systems it's useful for me to look at what is available in Drupal. A recent find is Care2X with a demo available. There are numerous plans to take these systems further:

 Care3g is seeking funding.

There is also Project Mtuha.

There's a post by Tim Schofield, "Helping African hospitals with open source software", that describes how the enterprise resource planning system KwaMoja (@KwaMoja) is being used to provide administration systems for hospitals in Africa.

Even though they are not on the same scale as commercial hospital systems in the USA and EU, these are significant software projects compared with my purposes, which are educational.

Care2X presentation

April 17, 2014

Physical-Cyber-Social Computing

Final submissions due: 1 September 2014
Publication issue: May/June 2015

Please email the guest editors a brief description of the article you plan to submit by 15 August 2014

Guest Editors: Payam Barnaghi, Manfred Hauswirth, Amit Sheth, and Vivek Singh (ic3-2015 AT computer.org)

Computing, communication, and mobile technologies are among the most influential innovations that shape our lives today. Technology advancements such as mobile devices that reach over half of Earth's population, social networks with more than a billion members, and the rapid growth of Internet-connected devices (the Internet of Things) offer a unique opportunity to collect and communicate information among everybody and everything on the planet. Interacting with the physical world enriches our existing methods of information exchange — sharing our thoughts, communicating social events, and work collaboration via the new dimension of physical computing. This all-encompassing "new world of information" requires that we be able to process extremely large volumes of data to extract knowledge and insights related to our surrounding environment, personal life, and activities, on both local and global scales.

These trends have led to an emergence of physical-cyber-social (PCS) computing, which involves a holistic treatment of data, information, and knowledge from the physical, cyber, and social worlds to integrate, understand, correlate, and provide contextually relevant abstractions to humans and the applications that serve them. PCS computing builds on and significantly extends current progress in cyber-physical, socio-technical, and cyber-social systems. This emerging topic seeks to provide powerful ways to exploit data that are available through various IoT, citizen and social sensing, Web, and open data sources that have either seen or will soon see explosive growth. Providing interoperable information representations and extracting actionable knowledge from the deluge of human and machine sensory data are key issues.

This special issue seeks innovative contributions to computer systems and interaction design, information processing and knowledge engineering, and adaptive solutions associated with PCS computing and the novel applications it enables. Potential topics include:
  • semantics and information modeling; semantic integration, fusion, and abstraction strategies;
  • stream processing and reasoning on complex PCS data; real-time feedback control and response systems; human/event/situation-centered views of data streams;
  • pattern recognition, trend detection, anomaly and event detection, semantic event processing, and inferring actionable knowledge techniques;
  • spatio-temporal, location-aware, continuous, scalable, and dynamic analysis;
  • security, privacy, and trust issues in collection, storage, and processing; and
  • novel and significant PCS applications, deployments, and evaluations in areas including personalized and contextualized information and alerts, health, biomedicine, smart cities, and human/social/economic development.
Submission Guidelines

All submissions must be original manuscripts of fewer than 5,000 words, focused on Internet technologies and implementations. All manuscripts are subject to peer review on both technical merit and relevance to IC's international readership — primarily practicing engineers and academics who are looking for material that introduces new technology and broadens familiarity with current topics. We do not accept white papers, and we discourage strictly theoretical or mathematical papers. To submit a manuscript, please log on to ScholarOne (https://mc.manuscriptcentral.com:443/ic-cs) to create or access an account, which you can use to log on to IC's Author Center and upload your submission.

My source:
Announcements mailing list
Announcements AT ubicomp.org

October 14, 2012




Twitter, like the Internet in general, has become a vast source of and resource for health care information. As with other tools on the Internet, it also has the potential to spread misinformation. In some cases this is done by accident by those with the best intentions. In other cases it is done on purpose, such as when companies promote their products or services using false accounts they created.

In order to help determine the credibility of tweets containing health-related content, I suggest using the following checklist (adapted from Rains & Karmikel, 2009):

  1. Author: Does the tweet contain a first and last name? Can this name be verified as being a real person by searching it on the Internet?
  2. Date: When was the tweet sent? If it is a re-tweet, when was the original tweet sent?
  3. Reference: Does the tweet reference a source? Is this source reliable?
  4. Statistics: Does the tweet make claims of effectiveness of a product or service using statistics? Are the statistics used properly?
  5. Personal story or testimonials: Does the tweet contain claims from an individual who has used or conducted research on the product or service? Is this individual credible?
  6. Quotations: Does the tweet quote or cite another source of information (e.g. a link) that can be checked? Is this source credible?

Ultimately it is up to the individual to determine how to use health information they find on Twitter or other Internet sources. For patients, anecdotal or experiential information shared by others with the same illness may be considered very credible. Others conducting research may find this a less valuable information source. Conversely, a researcher may only be looking for tweets that reference peer-reviewed journal articles, whereas patients and their caregivers may have little or no interest in this type of resource.


Rains, S. A., & Karmikel, C. D. (2009). Health information-seeking and perceptions of website credibility: Examining Web-use orientation, message characteristics, and structural features of websites. Computers in Human Behavior, 25(2), 544-553.






June 26, 2012


The altmetric movement is intended to develop new measures of production and contribution in academia. The following article provides a primer for research scholars on what metrics they should consider collecting when participating in various forms of social media.



If you participate on Twitter you should keep track of the number of tweets you send, how many times your tweets are replied to or re-tweeted by other users, and how many @mentions (tweets that include your Twitter handle) you receive. ThinkUp is an open source application that allows you to track these metrics, as well as other social media tools such as Facebook and Google+. Please read my extensive review of this tool. This service is free.


You should register with a URL shortening service such as bit.ly, which will provide you with an API key that you can enter into the applications you use to share links. This provides a means of keeping track of your click-through statistics in one location. Bit.ly records how many times a link you created was clicked on, along with the referrer and location of the user. Consider registering your own domain name and using it to shorten your links as a means of branding. In addition, you can use your custom links on electronic copies of your CV or at your own web site. This will inform you when your links have been clicked on. You should also consider using bit.ly to create the links used at your web site, providing you with feedback on which are used most often. For example, all of the links in this article were created using my custom bit.ly domain. In addition, you can tweet a link to any research study you publish, both to publicize it and to keep track of how many clicks it obtains. Bit.ly is a free service.


Another tool for measuring your tweets is TweetReach. This service lets you track the reach of your tweets by Twitter handle or by individual tweet. It provides output in formats that can be saved for use elsewhere (Excel, PDF, or the option to print or save your output by link). To use these latter features you must sign up for an account, but the service is free.


Buffer is a tool that allows you to schedule your tweets in advance. You can also connect Buffer to your bit.ly account so the links you share are included in your overall analytics. Although Buffer provides its own click-through counts, these may not match what appears in bit.ly. The service is free, with paid upgrade options that provide more detailed analytics.

Web presence

Google Scholar Citation Profile

You can set up a profile with Google Scholar based on your publication record. The metrics provided by this service include a citation count, h-index, and i10-index. When someone searches your name using Google Scholar, your profile will appear at the top, before any of the citations. This provides a quick way to distinguish your articles from those of someone else with the same name.
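Google Scholar computes these indices for you, but the definitions are simple enough to verify by hand: the h-index is the largest h such that h of your papers have at least h citations each, and the i10-index is the number of papers with at least 10 citations. A quick sketch:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

papers = [25, 12, 10, 6, 3, 1]  # citation counts per paper
print(h_index(papers))    # → 4
print(i10_index(papers))  # → 3
```

This is useful for sanity-checking the profile page, or for computing the indices over a subset of your publications.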

Google Feedburner for RSS feeds

If you maintain your own web site and use RSS feeds to announce new postings, you can also collect statistics on how many times your articles are clicked on. Feedburner, recently acquired by Google, provides one way to measure this. You enter your RSS feed URL and a report is generated, which can be saved in CSV format.
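If you want a quick local check rather than relying on Feedburner's reports, an RSS feed is just XML, and the Python standard library can inspect it. A minimal sketch that counts the entries in an RSS 2.0 feed (the sample feed below is invented for illustration):

```python
import xml.etree.ElementTree as ET

def count_rss_items(rss_xml):
    """Return the number of <item> entries in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return len(root.findall("./channel/item"))

# Hypothetical two-post feed, for illustration only.
sample_feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>My research blog</title>
    <item><title>Post one</title><link>http://example.org/1</link></item>
    <item><title>Post two</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

print(count_rss_items(sample_feed))  # → 2
```

The same parsing approach extends to pulling titles and links out of the feed for a posting inventory.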

Journal article download statistics

Many journals provide statistics on the number of times articles are downloaded. Keep track of those associated with your publications by visiting the journal's site. For example, BioMed Central (BMC) maintains an access count for the last 30 days, the last year, and all time for each of your publications.


Other means of contributing to the knowledge base in your field include participating in web-based forums or sites such as Quora. Quora provides threaded discussions on topics and allows participants to both pose and respond to questions. Other users vote on your responses, and points are accrued. If you want another user to answer your question, you must "spend" some of your points. Providing a link to your public Quora profile on your CV will demonstrate another form of contribution to your field.


Paper.li is a free service that curates content and renders it in a web-based newspaper format. The focus of my Paper.li is the use of technology in Canadian healthcare. I have also created a page that appears at my web site. Metrics on the number of times your paper has been shared via Facebook, Twitter, Google+, and LinkedIn are available.


Twylah is similar to Paper.li in that it takes content and displays it in a newspaper format, except it uses your Twitter feed. There is an option to create a personalized page; I use tweets.lauraogrady.ca. I also have a Twylah widget at my web site that shows my trending tweets in a condensed magazine layout; it appears in the side bar. This free service does not yet provide metrics, but it can help increase your tweet reach. If you create a custom link for your Twylah page, you can keep track of how many people visit it.

Analytics for your web site

Log file analysis

If you maintain your own web site, you can use a variety of tools to capture and analyze its use. One of the most popular applications is Google Analytics. If you are using a content management system such as WordPress, there are many plug-ins that will add the tracking code to the pages at your site and produce reports. WordPress also provides built-in analytics available through its dashboard.

If you have access to the raw log files, you could use a shareware log file analysis program or the open source tool Piwik. These tools will provide summaries of which pages of your site are visited most frequently, which countries your visitors come from, how long visitors remain at your site, and which search terms are used to reach it.
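The most basic of those summaries, page hit counts, can also be sketched directly from a raw access log. A minimal example, assuming Apache's common log format (the regex covers only the standard fields; combined-format logs or IPv6 addresses may need a richer pattern):

```python
import re
from collections import Counter

# Tally page hits from an Apache common-log-format access log.
# This pattern handles the standard fields only; real-world logs
# (combined format, IPv6, etc.) may require a richer regex.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3}) \S+')

def page_hits(log_lines):
    """Return a Counter of successfully served (2xx) request paths."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and m.group(4).startswith("2"):  # keep 2xx responses only
            hits[m.group(3)] += 1
    return hits

# Invented sample log lines, for illustration.
log = [
    '1.2.3.4 - - [26/Jun/2012:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '1.2.3.4 - - [26/Jun/2012:10:00:05 +0000] "GET /cv.html HTTP/1.1" 200 1024',
    '5.6.7.8 - - [26/Jun/2012:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '5.6.7.8 - - [26/Jun/2012:10:02:00 +0000] "GET /missing HTTP/1.1" 404 128',
]
print(page_hits(log).most_common(1))  # → [('/index.html', 2)]
```

Dedicated tools like Piwik add visitor geography, session duration, and search terms on top of this kind of tally.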


All of this information should be included in the annual report you prepare for your department and in your tenure application. Doing so will increase awareness of altmetrics and improve the ability to have these efforts "count" as contributions in your field.

June 24, 2012

The following provides a timeline of articles that appeared in newspapers and blogs from January 2011 to the present. The articles demonstrate a progression from patient engagement in online communities to coverage that includes increasing provider involvement.

  1. January 5th, 2011
  2. February 3rd, 2011
  3. February 22nd, 2011
  4. March 23rd, 2011
  5. April 2nd, 2011
  6. April 25th, 2011
  7. May 14th, 2011
