September 9, 2014

16:36



'Nuff said.

September 8, 2014

21:30
Another incredibly powerful post published on KevinMD.com, this from an anonymous medical student. Read it and weep. I did.

It was 4:30 a.m., and I was on the side of the road, drenched in sweat and tears. I had finally slowed my breathing to normal. I was going to be late for rounds. No time to obsess over possible questions. No time to memorize lab values, or practice regurgitating them.

I thought of home. My family and friends, whom I hadn’t seen in months. I cringed when I estimated how long it had been since I called them. And the place itself. The dry, clean heat of the desert. The pump jacks that dotted the landscape. The men with their muddy work boots and weathered skin. The brave, unconventional beauty, the humility of the region. And my heart ached to be there, to go back to a time when I was bright and hopeful. I think that’s where most of my sadness came from. Grieving the loss of her, the girl who wanted to do something that mattered.


I attended my dream school. I remember the day that I received my acceptance letter as one of the happiest in my life. I was going to learn from some of the smartest doctors in the world. I felt blessed. As a young man, my grandfather had crossed the border to pick cotton. His third grade education and shaky English would keep him working manual labor jobs for his entire life. My father was the first to graduate from high school. He, like most men back home, worked in the oilfields. And I was going to medical school. My family couldn’t help me fill out the applications or pay for the MCAT (I worked at a coffee shop to cover that). But they were my biggest fans, my cheerleading squad.

My decision to choose medicine was emotionally motivated. My mother became very sick during my junior year. She spent months in hospitals, on respirators and feeding tubes. I watched my mom suffering, and I hated that I didn’t understand what was going on, that I couldn’t help. Soon after she came home, I announced I was going to medical school. I had never been so sure.

We experienced intense stress and pressure to perform, to produce results. Early on, I stopped attending lectures, and watched from home. I could speed up the recording and learn twice as fast, I reasoned. Alone in the small apartment that my loans afforded me once I paid the hefty tuition bill, I worked diligently to produce what were considered mediocre grades at my institution. It is difficult to explain the isolation, the emptiness of this time. Those are two years I’ll never get back. Two years of youth and good health spent in an apartment.

I would call my friends and family often in the beginning, sobbing and anxious. But how could they understand? To them, to the outside, a doctor’s life seemed very glamorous indeed. After a while, I stopped calling.

The only patient contact I had was with actors, not real patients. Once or twice a semester, we would conduct earnest interviews with these pretend patients. We would be timed, filmed, and graded. Even our interactions with other human beings were carefully scripted and judged. If my university believed in one thing, it was that there was no human enterprise on Earth that could not be held to a rubric. They had yet to fail in their quest to quantify, to measure all of the qualities of an ideal doctor.

Then the grand finale: step 1, or as I like to call it: “The Most Important Test On The Planet: If You Screw Up You Will Never Get The Residency That You Have Dreamed About Since You Were Three Years Old.” Weeks of cramming material into my head. I drank coffee. I studied. Period. I was motivated by the promise of the clinical years. I was finally going to be able to interact with humans again. I prayed that the motivation, the drive I had lost somewhere along the way would return.

My happiest times in school were early in the morning, before the residents and the attendings were around to expose the holes in my knowledge, or reprimand me for forgetting to test cranial nerve IX, or scold me for my presentation being too long (or too short, depending on the person). It was listening to my patients as they told me about their children. Their patience as I clumsily stumbled through the interview. The way their faces relaxed as I told them that I would bring up their concerns to the doctor. Holding their hands and telling them it was going to be alright. Laughing, connecting, loving. Ironically, the shortest parts of my day. No time for that sort of thing with notes to write, tests to study for, articles to look up.

I attempted to explain the situation to the school psychologists. I tried to convey the sense of loss, the unmet expectations, the dying of a dream. I was told I was experiencing severe depression and anxiety, feelings that were internally generated. No possible flaw in the system, they rationalized. After all, there were rubrics. I was assured it would take months to treat me. Best to get on with it, numb up in time for the next rotation. Instead, I took a leave of absence.

I have been silent for too long. I have asked, “What’s wrong with me?” when I should have been asking, “What’s wrong with this?” I am compassionate and hardworking, yet I have been daily made to feel inadequate. I have been isolated from the people and the pursuits I love. I have given up everything, paid thousands of dollars, thousands of hours. I have repeated to myself over and over, “There is only medical school.” I almost believed it.

I never understood the trend of loss of empathy during medical training. Until now. See, when you’re in so much pain that if you thought of your life past this moment, this singular point in time, you would implode, pain seems as natural as breathing. Pain is part of life. Pain is nothing. You can’t stop to nurse your own wounds, you can’t talk about how much you hurt. So how could you possibly have enough room in your broken heart to take on someone else’s pain? So you don’t. You cover your bases and survive. You become that machine that you swore you’d never become. Because it hurts too much to feel, and it’s so much easier to float than swim.

I fantasize daily about leaving medicine for the endless sky back home. I miss the person that I was so very much. But I’m still here. And I hold onto my faded dreams in my little hands.

Why?

I remember that hospital room that smelled of isopropyl alcohol and sickness. I remember changing the sheets my mom soiled because the nursing staff was short in our small hospital. I remember the cold, detached doctors that came for ten minutes once a day. I remember how they spoke in riddles, how they seemed so far away. I remember.

I promise I won’t forget. I’ll never forget.

August 19, 2014

22:20
Because one of our sites has decided to replace some generations-old equipment, I had the joy of going on two site visits over the past couple of weeks. Both were sponsored by BIG NAMES, and both fell rather short. Which prompts me to examine the entire concept of the site visit.

In brief, the site needs two quite different pieces of equipment, both sold by the BIG VENDORS in question. Both teams got only half of it right, one showing us the first, and one showing us the second. Both seemed to be a little oblivious to the fact that we needed one of each. My recommendation at this point is to buy one machine from one vendor and the other from the other. I doubt that will happen.

So what went wrong? I'm not totally sure, but I think it probably comes down to someone not listening. I think we made our needs pretty clear, but...

Site visits can be fun, at least they were in the old days. I've been on what might have been one of the more expensive equipment junkets in the history of imaging. We had two Elscint CTs at the time, and the company wanted us to consider their MRIs. Our trip started at Elscint HQ in Haifa, Israel, and then took us to Kiel, Germany to see the only prototypes in existence of the machines we sought. The machines were actually quite impressive. Elscint had created one of the first high-field scanners, a 2T device, as well as a dual-gradient machine. There was just one little catch. The week before we left for the trip, Elscint was SOLD! GE purchased the nuclear medicine and MRI divisions, and Picker (later Philips) snagged the CT business. So GE ate the bill for me and my partner to look at scanners that were never manufactured! We did have a good time, though.

What is the point of a site visit? To see the machine? Here's a little secret: Most every scanner is a big box with a hole in it. Some have prettier cowling than others, some have a water-chiller in the corner, which looks rather like a fridge. Some have really nice LCD displays over the gantry. Whoopie. More importantly, one gets the chance to talk to the users, technologists, physicians, whomever. Usually, the salesmen have the tact to disappear for a moment so the bad stuff can be discussed as well as the good. (Bad stuff does come out. On our Fuji PACS site visit years ago, the PACS admin said, "Fuji support isn't so good and we have to maintain the system ourselves." Which was the end of Fuji.)

Of course, the most important part is the obligatory meal at vendor expense. But the days of picking the most expensive wine on the list are gone, and frankly I never felt terribly comfortable spending the vendors' money on frivolity anyway. Not that a fancy meal or trip can or should influence my choice, but the optics are what they are.

Ultimately, I think the days of the site-visit are numbered.

My friend Mike Cannavo, once again the One and Only PACSMan, ghost-wrote this paragraph for my RSNA Christmas Carol fantasy:
“Isn’t it obvious?” (the PACSMan) asked. “Here’s the deal. No one knows where healthcare is going, so we’re all going to start enjoying Thanksgiving again for the first time in 75 years. Instead of freezing our asses off, we’ll do an interactive virtual conference with scheduled demos and everything. No muss, no fuss, and no ‘free’ meals. As a bonus, system prices will drop 30% because vendors won’t have to pay for RSNA. It’s sheer brilliance, I tell ya!"
Mike was referring to the vendor extravaganza at RSNA, but I think this applies to site-visits as well. There is simply no need to haul people across the countryside (or country, for that matter) to see the scanner. They all look pretty much the same, and decisions are not made on the basis of their appearance. (Bore size and other specs are important, but that's all in the specs.)

Conversations with the important people can be choreographed by phone with little difficulty. And images, the most important piece of my puzzle, can be sent, hopefully in a form that will easily load on the customers' PACS. (Yes, that can be a problem.)

Hey, I like a paid day off as much as anyone else, but I'm getting too old to drag my carcass around the neighborhood and indeed the country to spend 5 minutes in the presence of the Holey Box and its keepers. Let's save a few thousand (or tens of thousands of) dollars and try it my way.

I've probably just made myself a target for those who like getting wined and dined and taken to various exotic places like we just were, but times change, boys. Go spend the time with your family instead. That goes for the vendors, too.

September 17, 2014

7:02

20 Questions for Health IT 12

Question number 12 of our “20 Questions for Health IT” project. Please comment in the comments section or on Twitter using the #20HIT tag. View the other questions and comments here.

Question from David Muntz, SVP and CIO at GetWellNetwork:

Given your interest in patients and families, how satisfied are you with the CCDA as a means for data exchange with patient oriented apps?

Partially satisfied. I would really like to see HL7 standards created that make it possible for the patient to become the custodian of their own data. That’s really only possible if standards for data migration are created that enable a patient to securely pull complete data from all their sources of choice. Those sources would include not just EHRs, but the whole of health information technology. The patient could then choose when and how much to share using other existing standards, such as CCDA, or these new standards.

Categories: News and Views, All

September 16, 2014

6:00

20 Questions for Health IT 11

Question number 11 of our “20 Questions for Health IT” project. Please comment in the comments section or on Twitter using the #20HIT tag. View the other questions and comments here.

Question from Mandi Bishop, health IT consultant:

What could health IT innovators do to markedly improve healthcare and achieve Triple Aim goals, in lieu of forced compliance with CMS/ONC mandates, to offset Meaningful Use incentive dollars and associated reimbursement penalties?

Governance.

Don’t laugh. I’m serious!

Instituting governance is the single most innovative thing health IT can do in lieu of Meaningful Use that would offset the loss of incentive dollars and reimbursement penalties, improve population health, and improve patient clinical outcomes.

Sound like Triple Aim goals? It should. You don’t have to implement new technology to make the most of what you have. It’s all about the process, baby.

How much faster would your revenue cycle be if 30% more patient data was valid?

What if identifying a vocabulary owner of LOINC freed your lab from its local compendium, and you were able to reconcile and aggregate your own labs with third-party labs? How many fewer unnecessary tests might you order?

How is the patient experience improved when the patient and all caregivers have access to accurate, timely, relevant health information from all their care sources?

Innovating is thinking outside the box. Right now, that box is Meaningful Use – the program where deadlines don’t just discourage, but blatantly ignore, fundamental IT governance principles, and our nation’s patients have become unwitting lab rats in the grandest series of human trials the FDA never realized it should have reviewed.

So be a rebel. Get back to basics. Block, tackle, and institute governance.

Categories: News and Views, All

September 15, 2014

6:00

20 Questions for Health IT 10

Question number 10 of our “20 Questions for Health IT” project. Please comment in the comments section or on Twitter using the #20HIT tag. View the other questions and comments here.

Question from Rob Brull, product manager at Corepoint Health:

In what ways does it make sense to extend Direct Project beyond those already defined by Meaningful Use?

Most of the implementations I am aware of for Direct protocol involve the sending and receiving of Summary of Care documents to satisfy the Transfer of Care (ToC) requirement for Meaningful Use. There is one radiology site my company is working with to send a Diagnostic Imaging Report via Direct. My thought is that lab orders, lab results, and radiology reports are obvious workflows for implementing the Direct protocol. No VPN or agent is required and hopefully all provider facilities will already have a Direct address based on ToC requirements. The real hurdle will be for vendors to extend their capabilities to send and receive payloads other than Summary of Care.

Categories: News and Views, All

January 6, 2014

16:11
GNUmed now supports the following workflow:

- patient calls in asking for documentation on his back pain

- staff activates patient

- staff adds to the patient export area a few
  documents from the document archive clearly related
  to episodes of back pain

- staff writes inbox message to provider assigned to patient

- provider logs in, activates patient from inbox message

- provider adds a few more documents into the export area

- provider screenshots part of the EMR into the export area

- provider includes a few files from disk into export area

- provider creates a letter from a template and
  stores the PDF in the export area

- provider notifies staff via inbox that documents
  are ready for mailing to patient

- staff activates patient from inbox message

- staff burns export area onto CD or DVD and
  mails to patient

- staff clears export area

Burning media requires both a mastering application
(like k3b) and an appropriate script, gm-burn_doc
(like the attached), to be installed. Burning the
directory passed to the burn script onto some media
produces an ISO image like the attached.
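For illustration, a burn script of this shape can be sketched in a few lines of POSIX shell. This is only a sketch of the idea, not the actual gm-burn_doc shipped with GNUmed; the tool choices (genisoimage, k3b), the function name, and the default paths are all assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of a gm-burn_doc-style helper (NOT the real script):
# GNUmed hands the export-area directory to the script, which masters it
# into an ISO image and passes that image to a mastering application (k3b).
burn_export_area() {
    export_dir="$1"                         # directory GNUmed passes in
    iso="${2:-/tmp/gnumed-export.iso}"      # where to put the ISO image
    # Master the directory into an ISO 9660 image with Joliet and Rock
    # Ridge extensions so the disc is readable on Windows, Mac, and Linux
    genisoimage -J -r -V "GNUMED_EXPORT" -o "$iso" "$export_dir" &&
    # Hand the finished image to k3b for the actual burn
    k3b --cdimage "$iso"
}
```

Invoked as `burn_export_area /path/to/export/area`, such a script would produce exactly the kind of ISO image described above.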

Karsten
--
GPG key ID E4071346 @ gpg-keyserver.de
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346

November 26, 2013

5:10
Here it is

0.) Do a full backup. Save it on some media other than your hard disk! Do it,
now.

1.) Install PG 9.3 (I tried with 32-bit, but it should not matter).
- http://get.enterprisedb.com/postgresql/postgresql-9.3.1-1-windows.exe

2.) Run the installer and select (English_UnitedStates) for locale (others
might work as well). Make sure it installs itself on port 5433 (or another
port, but never 5432!).

3.) Make sure both PG 8.4 and PG 9.3 are running (e.g. via pgadmin3 from PG
9.3)

4.) open a command shell (dos box) - "run as" administrator (!) in Win7

5.) type : RUNAS /USER:postgres "CMD.EXE"
- this will open another black box (command shell) for user postgres
- for the password use 'postgrespassword' (default)

6.) type: SET PATH=%PATH%;C:\Programme\PostgreSQL\9.3\bin;
- instead of Programme it might be Program Files on your computer

7.) type: cd c:\windows\temp
- changes directory to a writable temporary directory

8.) type: pg_dump -p 5432 -Fc -f gnumedv18.backup gnumed_v18

9.) type: pg_dumpall -p 5432 --globals-only > globals.sql

Important: Protect your PG 8.4 by shutting it down temporarily.

10.) type in the first command shell : net stop postgresql-8.4
- check that it says : successfully stopped

11.) psql -p 5433 -f globals.sql
- this will restore roles in the new database (PG 9.3 on port 5433)

12.) pg_restore -p 5433 --dbname postgres --create gnumedv18.backup
- this will restore the database v18 into the PG 9.3 on port 5433
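For readability, the dump-and-restore core (steps 8, 9, 11, and 12) can be collected into one POSIX-shell function. This is a sketch only, not part of the walkthrough; on Windows these same four commands run one by one in the postgres user's command shell, with PG 8.4 stopped (step 10) between the dumps and the restores.

```shell
# The four database commands from steps 8, 9, 11 and 12, wrapped in a
# shell function as a summary. Ports follow the walkthrough: 5432 is the
# old PG 8.4 cluster, 5433 the new PG 9.3 cluster.
migrate_gnumed_v18() {
    pg_dump    -p 5432 -Fc -f gnumedv18.backup gnumed_v18  # step 8: dump the v18 database
    pg_dumpall -p 5432 --globals-only > globals.sql        # step 9: dump roles and other globals
    psql       -p 5433 -f globals.sql                      # step 11: restore roles into PG 9.3
    pg_restore -p 5433 --dbname postgres --create gnumedv18.backup  # step 12: restore the database
}
```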

Congratulations. You are done. Now to check some things.

########################################
Here you could run the fingerprint script on both databases to check for an
identical hash

https://gitorious.org/gnumed/gnumed/source/f4c52e7b2b874a65def2ee1b37d8ee3fb3566ceb:gnumed/gnumed/server/gm-fingerprint_db.py

########################################

13.) Open gnumed.conf in c:\programme\gnumed-client\
For the profile [GNUmed database on this machine ("TCP/IP": Windows/Linux/Mac)]
change port=5432 to 5433.

14.) Run the GNUmed client and check that it is working. If it works (no wrong
schema hash detected) you should see all your patients and data.

15.) If you have managed to see your patients and everything is there, close
GNUmed client 1.3.x.

16.) in the first command shell type: net stop postgresql-9.3

17.) Go to c:\Programme\PostgresPlus\8.4SS\data and open postgresql.conf. Find
port = 5432 and change it to port = 5433

18.) Go to c:\Programme\Postgresql\9.3\data and open postgresql.conf. Find port =
5433 and change it to 5432. This effectively switches ports for PG 8.4 and 9.3
so PG 9.3 runs on the default port 5432.

19.) Open gnumed.conf in c:\programme\gnumed-client\
For the profile [GNUmed database on this machine ("TCP/IP": Windows/Linux/Mac)]
change port=5433 to 5432.

20.) Restart PG 9.3 with: net start postgresql-9.3.

21.) Open the GNUmed client and connect (to PG 9.3 on port 5432).

22.) Leave PG 8.4 in a shutdown state.

So far we have transferred database v18 from PG 8.4 to 9.3. No data from PG
8.4 is touched/lost.

23.) Now you are free to install gnumed-server v19 and gnumed-client 1.4.
Having installed gnumed-server v19, select 'database upgrade' (not bootstrap
database) and it will upgrade your v18 database to a v19 database.

In case you experience problems you can always shut down PG 9.3, switch ports again, install client 1.3.x, start PG 8.4 (net start postgresql-8.4) and work with your old setup.

November 13, 2013

7:26
The release notes prominently tell us that GNUmed 1.4.x requires at least PostgreSQL 9.1.

If you are running the Windows packages and have let GNUmed install PostgreSQL for you, you are good to go, since it comes with PostgreSQL 9.2 already.

If you are on Ubuntu or Debian, chances are your system still has PostgreSQL 8.x installed.

First check whether you run any software that requires you to continue using PostgreSQL 8.x. If so, you can install PG 9.1 side by side with it. If not, let PG 9.1 replace PG 8.x.

It usually works like this.

sudo apt-get install postgresql-9.1
sudo pg_upgradecluster 8.4 main

Then if you don't need PG 8.4 anymore you could

sudo pg_dropcluster --stop 8.4 main
sudo apt-get purge postgresql-8.4
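A quick sanity check after the upgrade can be wrapped in a small function. This is a suggestion, not part of the upgrade itself; it assumes Debian/Ubuntu's postgresql-common package, which provides pg_lsclusters.

```shell
# Post-upgrade sanity check: confirm the 9.1 cluster is online and is the
# one answering on the default port 5432.
check_pg_upgrade() {
    pg_lsclusters                            # 9.1/main should be listed as online on port 5432
    sudo -u postgres psql -p 5432 -c 'SHOW server_version;'  # should report a 9.1.x version
}
```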

Have fun.

March 6, 2013

11:53

Healthcare executives are continuously evaluating RFID and RTLS in general, whether to maintain the hospital's competitive advantage, differentiate in the market, improve compliance with requirements from AORN, JCAHO, and the CDC, or improve asset utilization and operating efficiency. As part of these evaluations there is a constant concern about a tangible, measurable ROI for solutions that can come at a significant price.

When considering the areas that RTLS can affect within the hospital facilities as well as other patient care units, there are at least four significant points to highlight:

Disease surveillance: Hospitals face many challenges around disease management and how to handle it. RTLS technology can determine each and every staff member who could have potentially been in contact with a patient classified as highly contagious or with a specific condition.

Hand hygiene compliance: Many health systems report hand hygiene compliance as part of safety and quality initiatives. Some use “look-out” staff to walk the halls and record all hand hygiene activities. With the introduction of RTLS, however, hand hygiene protocol compliance when clinical staff enter a room or use the dispensers can now be dynamically tracked and reported on. Several of the systems available today also provide active alerts to clinicians whenever they enter a patient’s room and haven’t complied with the hand hygiene guidelines.

Locating equipment for maintenance and cleaning:

Having the ability to identify the location of equipment that is due for routine maintenance or cleaning is critical to ensuring the safety of patients. RTLS is capable of providing alerts on equipment to staff.

In one recent case, a hospital spent two months on a benchmarking analysis and found that it took an average of 22 minutes to find an infusion pump. After the implementation of RTLS, it took an average of two minutes. This cuts down on lag time in care and helps ensure that clinicians have the tools and equipment they need, when the patient needs them.

There are also other technologies and products which have been introduced and integrated into some of the current RTLS systems available.

EHR integration:

There are several RTLS systems that integrate with bed management systems as well as EHR products, delivering patient order status and alerts within the application. This has enabled nurses to stay in one screen and see a summary of updated patient-related information.

Unified Communication systems:

Nurse call systems have enabled nurses to communicate efficiently anywhere the device is deployed within the hospital facility. These functionalities are starting to infiltrate the RTLS market, and for some of the Unified Communication firms it means that their infrastructure can now provide a backbone for system integrators to integrate RTLS functionality into their products.

In many of the recent implementations of RTLS products, hospital executives opted to deploy the solutions within one specific area to pilot the solutions.  Many of these smaller implementations succeed and allow the decision makers to evaluate and measure the impacts these solutions can have on their environment.  There are several steps that need to be taken into consideration when implementing asset tracking systems:

•             Define the overall goals and driving forces behind the initiative

•             Identify the challenges and opportunities the RTLS solution will be able to address

•             Identify the operational area that would yield the highest impact with RTLS

•             Identify infrastructure requirements and technology of choice (WiFi based, RFID based, UC integration, interface capability requirements)

•             Define overall organizational risks associated with these solutions

•             Identify compliance requirements around standards of use

Conclusion

RFID is one facet of sensory data that is being considered by many health executives. It is providing strong ROI for many of the adopters applying it to improve care and increase efficiency of equipment usage, as well as equipment maintenance and workflow improvement. While there are several different hardware options to choose from, and technologies ranging from Wi-Fi to IR/RF, this technology has been showing real value and savings that health care IT and supply chain executives alike can't ignore.

February 21, 2013

14:41

It was not long after mankind invented the wheel that carts came around. Throughout history people have been mounting wheels on boxes; now we have everything from golf carts, shopping carts, and hand carts to my personal favorite, the hotdog cart. So you might ask yourself, “What is so smart about a medical cart?”

Today’s medical carts have evolved to be more than just a storage box with wheels. Rubbermaid Medical Solutions, one of the largest manufacturers of medical carts, has created a cart specially designed to house computers, telemedicine equipment, and medical supplies, and to offer medication dispensing. Currently the computers on medical carts are used to provide access to CPOE, eMAR, and EHR applications.

With the technology trend of mobility quickly on the rise in healthcare, organizations might question the future viability of medical carts. However, a recent HIMSS study showed that cart use at the point of care was on the rise, from 26 percent in 2008 to 45 percent in 2011. The need for medical carts will continue to grow; as a result, cart manufacturers are looking for innovative ways to separate themselves from the competition. Medical carts are evolving from healthcare products to healthcare solutions. Instead of selling medical carts with web cameras, cart manufacturers are developing complete telemedicine solutions that offer remote appointments throughout the country, allowing specialists to broaden their availability to patients in need. Carts are even interfaced with eMAR systems to increase patient safety; the evolution of the cart is rapidly changing the daily functions of the medical field.

Some medical carts of the future will be able to automatically detect their location within a healthcare facility. For example, if a cart is improperly stored in a hallway for an extended period of time, staff could be notified to relocate it in order to comply with the Joint Commission’s requirements. Real-time location information could also allow carts to automatically handle tedious tasks commonly performed by healthcare staff. When a cart is rolled into a patient room, it could automatically open the patient’s electronic chart or display a patient visit summary through signals exchanged between the entering cart and a logging device kept in the room.

Autonomous robots are now starting to be used in larger hospitals such as the TUG developed by Aethon. These robots increase efficiency and optimize staff time by allowing staff to focus on more mission critical items. Medical carts in the near future will become smart robotic devices able to automatically relocate themselves to where they are needed. This could be used for scheduled telemedicine visits, the next patient in the rounding queue or for automated medication dispensing to patients.

Innovation will continue in medical carts as the need for mobile workspaces increases. What was once considered a computer on wheels could be the groundwork for care automation in the future.

September 10, 2012

9:35

This has been an eventful year for speech recognition companies. We are seeing increased development of intelligent systems that can interact via voice. Siri was simply a re-introduction of digital assistants into the consumer market, and since then other mobile platforms have implemented similar capabilities.

In hospitals and physicians’ practices, the use of voice recognition products tends to center on traditional speech-to-text dictation for SOAP (subjective, objective, assessment, plan) notes, plus some basic voice commands to interact with EHR systems. While there are several new initiatives that will involve speech recognition, natural language understanding and decision support tools are becoming the focus of many technology firms. These changes will begin a new era for speech engine companies in the health care market.

While there is clearly tremendous value in using voice solutions to assist during the capture of medical information, there are several other uses that health care organizations can benefit from. Consider a recent product by Nuance called “NINA”, short for Nuance Interactive Natural Assistant. This product consists of speech recognition technologies that are combined with voice biometrics and natural language processing (NLP) that helps the system understand the intent of its users and deliver what is being asked of them.

This app can provide a new way to access health care services without the complexity that comes with cumbersome phone trees, and website mazes. From a patient’s perspective, the use of these virtual assistants means improved patient satisfaction, as well as quick and easy access to important information.

Two areas we can see immediate value in are:

Customer service: Simpler is always better, and with NINA-powered apps or Siri-like products, patients can easily find what they are looking for, whether they are calling a payer to see if a procedure is covered under their plan or contacting the hospital to ask about the closest pediatric urgent care. These tools will provide quick access to the right information without having to navigate complex menus.

Accounting and PHR interaction: To see the true potential of these solutions, consider some of the use cases that Nuance has been exhibiting. From a health care perspective, patients would have the ability to simply ask to schedule a visit without having to call the office. A patient could also request a medication refill the same way.

Nuance did address some of the security concerns with tools such as VocalPassword, which tackles authentication. This helps verify the identity of patients who are requesting services and giving commands. As more intelligent voice-driven systems mature, the areas to focus on will be operational costs, customer satisfaction, and data capture.

September 1, 2014

12:50

Cambridge HealthTech Institute (CHI) invited me to attend their Next Generation Point of Care Diagnostics Conference, and I came away thoroughly impressed with the content, speakers, and organization. Since I chair several conferences a year, I know how hard it is to pull off a good one, so I’d like to thank CHI for a job well done. While I took the notes and attended the event, this post was written by HITSphere‘s Vik Subbu, our Digital Health editor, who focuses on Bio IT and Pharma IT. Bio IT, Pharma IT, Health IT, and MedTech are all going to be merging over the next few years, and Vik will be helping our audience understand those shifts and what they mean to Digital Health innovators. Here’s Vik’s recap of the conference:

Goals & Attendees

The goal of the event was to provide a progress update to the healthcare industry on the advances in next generation point-of-care (POC) diagnostics while highlighting the advent of innovative platforms and use of digital information systems to aid in the development of novel POC diagnostics. The conference was attended by industry experts from various disciplines ranging from academic institutions, non-profit computational and bioinformatics centers, venture capital, service providers, pharmaceutical, diagnostic and biotechnology companies.

Why does Point of Care Dx matter to Digital Health innovators?

The interactions and cross-fertilization of ideas among various disciplines in the diagnostic arena was the highlight of the conference. The ability to have real time interactions between academic researchers, clinicians, product developers and reimbursement specialists provided a ‘one stop’ venue for an attendee to obtain a holistic overview of both the promises and pitfalls in developing point-of-care diagnostics. The outcome of the conference should yield greater public-private collaborations involving novel platforms, available NGS datasets, and academic laboratories. Such partnerships will hopefully enable the industry to overcome product development and reimbursement barriers while paving the way for effective and streamlined approval process for next generation POC diagnostics. All of this will help integrate POC better into next generation Digital Health innovations.

The intimate setting and the organization of the parallel track discussions/presentations were well designed and covered key aspects of POC diagnostics. For anyone looking to learn the current and future directions of POC diagnostics, the conference provided a nice platform to learn, understand, and meet key contacts to support their individual interests. Entrepreneurs and innovators focusing on bridging the “gap” between healthcare IT and diagnostics will find that there was a recurring theme that surfaced in many of the presentations but wasn’t really the focal point of any one of them. That topic was data. Many presentations highlighted the “use of genomic data”, “the use of computational super tools to assimilate or generate vast amounts of data”, or “the need for better data standards to achieve meaningful results”. While these were great presentations, none of the speakers focused on the “how” (which is a huge opportunity for entrepreneurs): for example, “how can one gain broader insights from these datasets?” or “how can we solve the issues of standardization of datasets?”. Perhaps this is the homework assignment we must complete in time for next year’s conference.

Top Ten Insights for Healthcare IT innovators:

  1. Next Generation Sequencing (NGS) will continue to play a vital role in disease detection and biomarker identification
  2. The increasing availability of publicly available datasets from the FDA and academia will help guide the development of next generation POC diagnostics
  3. Point of care diagnostics for hospital acquired infectious diseases remains an unmet need
  4. The need for improving sensitivity and specificity of diagnostic assay platforms is acute
  5. Reimbursement discussions need to occur with payers from day one
  6. Early stage diagnostic companies can benefit from innovative business models and strategic partnerships
  7. Clinical samples are required to validate an assay or biomarker, yet finding these longitudinal samples remains a challenge
  8. Software tools and POC diagnostics have improved disease identification and patient outcomes... but we have a long way to go
  9. Establishing better workflows, processes and teams can lead to better outcomes
  10. Integrating disparate datasets can yield better insights and patient outcomes

August 29, 2014

16:17

Given the number of breaches we’ve seen this summer at healthcare institutions, I’ve spent a ton of time recently on several engineering engagements looking at “HIPAA compliant” encryption (HIPAA compliance is in quotes since it’s generally meaningless). Since I’ve heard a number of developers say “we’re HIPAA compliant because we encrypt our data,” I wanted to take a moment to unbundle that statement and make sure we all understand what it means. Cryptography in general, and encryption in particular, is difficult to get right; CISOs, CIOs, and HIPAA compliance officers shouldn’t just believe vendors who say “we encrypt our data” without asking for elaboration in these areas:

  • Encryption status of data at rest in block storage (the file system that the apps, databases, VMs, are stored on)
  • Encryption status of data at rest in virtual machine block storage
  • Encryption status of data at rest in archived storage (backups)
  • Encryption status of data at rest in the Oracle/SQL*Server/DB2/MySQL/PostgreSQL/(your vendor) databases (which sit on top of the file system)
  • Encryption status of data in transit from database to app server
  • Encryption status of data in transit from app server to proxy server (HTTP server)
  • Encryption status of data in transit from proxy server to end user’s client
  • Encryption status of data in transit from API servers to end user’s clients (iOS, Android, etc.)
  • Encryption status of server to server file transfers
  • Encryption key management in all of the above

When you look at encrypting data, it’s not just “in transit” or “at rest”: data can be in transit or at rest in a variety of places.

If you care about security, ask for the details.
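One way to make the vendor conversation concrete is to treat the checklist above as data and audit a vendor's answers against it. The sketch below is hypothetical: the layer names, the answer format, and the "AWS KMS" value are illustrative assumptions, not any real vendor's questionnaire.

```python
# Hypothetical sketch: turning the encryption checklist into a vendor-review audit.
# Layer names and the answer format are illustrative only.

ENCRYPTION_LAYERS = [
    "block storage at rest",
    "VM block storage at rest",
    "archived storage (backups) at rest",
    "database at rest",
    "database-to-app-server transit",
    "app-server-to-proxy transit",
    "proxy-to-client transit",
    "API-server-to-client transit",
    "server-to-server file transfer",
]

def audit(vendor_answers):
    """Return the layers a vendor left unencrypted, unanswered,
    or without a stated key-management scheme."""
    gaps = []
    for layer in ENCRYPTION_LAYERS:
        answer = vendor_answers.get(layer, {})
        if not answer.get("encrypted") or not answer.get("key_management"):
            gaps.append(layer)
    return gaps

# A vendor that "encrypts data" everywhere except its backups:
claims = {layer: {"encrypted": True, "key_management": "AWS KMS"}
          for layer in ENCRYPTION_LAYERS}
claims["archived storage (backups) at rest"] = {"encrypted": False}
print(audit(claims))  # ['archived storage (backups) at rest']
```

The point of the structure is that "we encrypt our data" collapses into nine separate yes/no answers, each with its own key-management story.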

August 21, 2014

1:00

These days it’s pretty easy to build almost any kind of software you can imagine — what’s really hard, though, is figuring out what to build. As I work on complex software systems in government, medical devices, healthcare IT, and biomedical IT I find that tackling vague requirements is one of the most pervasive and difficult problems to solve. Even the most experienced developers have a hard time building something that has not been defined well for them; a disciplined software requirements engineering approach is necessary, especially in safety critical systems. One of my colleagues in France, Abder-Rahman Ali, is currently pursuing his Medical Image Analysis Ph.D. and is passionate about applying computer science to medical imaging to come up with algorithms and systems that aid in Computer Aided Diagnosis (CAD). He’s got some brilliant ideas, especially in the use of fuzzy logic and storytelling to elicit better requirements so that CAD may become a reality some day. I asked Abder-Rahman to share with us a series of blog posts about how to tackle the problem of vague requirements. The following is his first installment, focused on storytelling and how it can be used in requirements engineering: 

I remember how, when I was a child, my grandmother used to tell us fictional and non-fictional stories. They still ring in my ears, even after all the years that have passed. We would just sit down, open our ears, fix our eyes on her, wander along with our thoughts, and not come out of that state until the story ended. Sometimes, when we made trouble, simply being called to hear a story was enough to calm us, and those same feelings came over us again.

Phebe Cramer, in her book, Storytelling, Narrative, and the Thematic Apperception Test, mentions how storytelling has a long tradition in human history. She highlights what have been considered the most significant means by which man has told his story: the famous epic poems, the Iliad and the Odyssey from the ninth century B.C., the Aeneid from 20 B.C., the east Indian Mahabharata and Ramayana from the fourth century A.D., etc. This is how history was transmitted from one generation to the next.

Storytelling Tips and Tales emphasizes that stories connect us to the past and illuminate the future for us; lessons can be learned from stories, and information is transmitted transparently and smoothly through them. Teachers in schools are even being encouraged to use storytelling in their classrooms. The book also holds that storytelling is an engaging process, rewarding for both the teller and the listener: listeners enter new worlds simply by hearing the teller's words. In Knowledge and Memory: The Real Story, Schank and Abelson go so far as to say that psychological studies have revealed that human beings learn best from stories.

Having mentioned that, a requirements engineer may ask: why not bring storytelling to our domain, especially since our work, too, has a teller and a listener? Well, could that really be done?

Let us examine the relationships between story elements and a software requirement in order to answer that question.

In his book, Telling Stories: A Short Path to Writing Better Software Requirements, Ben Rinzler highlights these relationships as follows (explanations for some of the points are also drawn from Using Storytelling to Record Requirements: Elements for an Effective Requirements Elicitation Approach):

  1. Conflict: This is the problem you want to solve in the requirements process. An example is the conflict between stakeholders' needs and the FDA regulatory requirements for some medical device software.
  2. Theme: This is the central concept underlying the solution. In requirements engineering, this could be a “requirement”, that is, the project goal.
  3. Setting: The setting is the place and time of the story. In requirements engineering, this is the broader context of the problem at hand, such as information about the technology environment, the business, etc.
  4. Plot: The plot of a story is its sequence of events, ordered so that their outcomes affect later ones. In requirements engineering, this is the series of actions of the current and future systems.
  5. Character: This refers to any entity capable of action. In requirements engineering, this can represent people, machines, and programs, for instance.
  6. Point of view: Having different points of view is important for building a unified view that describes the whole of what is actually happening and what everyone needs. This is like describing a medical device software process from both the patient's and the physician's points of view, for instance.

So, yes, a relationship and an analogy do exist between storytelling and software requirements.
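As an illustration, the six story elements map naturally onto a small data structure that an elicitation session could fill in per story. This is my own sketch, not Rinzler's notation; the field names and the infusion-pump example are hypothetical.

```python
# Illustrative sketch: capturing one elicited story by Rinzler's six elements.
# Field names and the example values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class RequirementStory:
    conflict: str                # the problem to solve
    theme: str                   # the central concept / project goal
    setting: str                 # business and technology context
    plot: list = field(default_factory=list)        # ordered events
    characters: list = field(default_factory=list)  # people, machines, programs
    points_of_view: list = field(default_factory=list)

story = RequirementStory(
    conflict="Stakeholder needs vs. FDA regulatory requirements",
    theme="Safe dose entry in an infusion pump",
    setting="Hospital ward; embedded device with a nurse-facing UI",
    plot=["nurse enters dose", "software validates range", "pump delivers"],
    characters=["nurse", "pump firmware", "hospital formulary service"],
    points_of_view=["patient", "physician", "nurse"],
)
print(story.theme)  # Safe dose entry in an infusion pump
```

Structuring each story this way makes gaps visible: a story with an empty `points_of_view` list, for example, has only been told from one side.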

In future posts in the series, Shahid and I will dig deeper into how storytelling can be employed in the requirements engineering process, and will also try to show how fuzzy logic can be embedded in the process to solve issues that may be inherent in the storytelling method.

Meanwhile, drop us comments if there are specific areas of requirements engineering complex software systems that you’re especially interested in learning more about.

March 12, 2010

11:01
This blog is now located at http://blog.rodspace.co.uk/. You will be automatically redirected in 30 seconds, or you may click here. For feed subscribers, please update your feed subscriptions to http://blog.rodspace.co.uk/feeds/posts/default.

March 3, 2010

4:07
I've just heard about the Information Technology and Communications in Health (ITCH) conference, which will be held February 24 - 27, 2011, at the Inn at Laurel Point, Victoria, BC, Canada. I'd not heard of this conference before, but the current call for papers looks interesting. Health Informatics: International Perspectives is the working theme for the 2011 international conference. Health informatics is now a [...]
3:59
The report of the Prime Minister’s Commission on the Future of Nursing and Midwifery in England, published yesterday, sets out the way forward for the future of the professions and calls for the establishment of a "high-level group to determine how to build nursing and midwifery capacity to understand and influence the development and use of new technologies. It must consider how pre- and [...]

June 9, 2013

16:10

“Large collections of electronic patient records have long provided abundant, but under-explored information on the real-world use of medicines. But when used properly these records can provide longitudinal observational data which is perfect for data mining,” Duan said. “Although such records are maintained for patient administration, they could provide a broad range of clinical information for data analysis. A growing interest has been drug safety.”

In this paper, the researchers proposed two novel algorithms, a likelihood ratio model and a Bayesian network model, for adverse drug effect discovery. Although the performance of each is only comparable to the state-of-the-art algorithm (a Bayesian confidence propagation neural network), the researchers say that combining the three yields better, more diverse results.

via www.njit.edu

I saw this a few weeks ago, and while I haven't had the time to delve deep into the details of this particular advance, it did at least give me more reason for hope with respect to the big picture of which it is a part.

It brought to mind the controversy over Vioxx starting a dozen or so years ago, documented in a 2004 article in the Cleveland Clinic Journal of Medicine. Vioxx, released in 1999, was a godsend to patients suffering from rheumatoid arthritic pain, but a longitudinal study published in 2000 unexpectedly showed a higher incidence of myocardial infarctions among Vioxx users compared with the former standard-of-care drug, naproxen. Merck, the patent holder, responded that the difference was due to a "protective effect" it attributed to naproxen rather than a causative adverse effect of Vioxx.

One of the sources of empirical evidence that eventually discredited Merck's defense of Vioxx's safety was a pioneering data mining epidemiological study conducted by Graham et al. using the live electronic medical records of 1.4 million Kaiser Permanente of California patients. Their findings were presented first in a poster in 2004 and then in the Lancet in 2005. Two or three other contemporaneous epidemiological studies of smaller non-overlapping populations showed similar results. A rigorous 18-month prospective study of Vioxx's efficacy in preventing colon polyps showed an "unanticipated" significant increase in heart attacks among study participants.

Merck's withdrawal of Vioxx was an early victory for Big Data, though it did not win the battle alone. What the controversy did do was demonstrate the power of data mining in live electronic medical records. Graham and his colleagues were able to retrospectively construct what was effectively a clinical trial based on over 2 million patient-years of data. The fact that EMR records are not as rigorously accurate as clinical trial data capture was rendered moot by the huge volume of data analyzed.

Today, the value of Big Data in epidemiology is unquestioned, and the current focus is on developing better analytics and in parallel addressing concerns about patient privacy. The HITECH Act and Obamacare are increasing the rate of electronic biomedical data capture, and improving the utility of such data by requiring the adoption of standardized data structures and controlled vocabularies.

We are witnessing the dawning of an era, and hopefully the start of the transformation of our broken healthcare system into a learning organization.

 

Source: FutureHIT

June 7, 2013

13:51

I believe if we reduce the time between intention and action, it causes a major change in what you can do, period. When you actually get it down to two seconds, it’s a different way of thinking, and that’s powerful. And so I believe, and this is what a lot of people believe in academia right now, that these on-body devices are really the next revolution in computing.

via www.technologyreview.com

I am convinced that wearable devices, in particular heads-up devices of which Google Glass is an example, will be playing a major role in medical practice in the not-too-distant future. The above quote from Thad Starner describes the leverage point such devices will exploit: the gap that now exists between deciding to make use of a device and being able to carry out the intended action.

Right now it takes me between 15 and 30 seconds to get my iPhone out and do something useful with it. Even in its current primitive form, Google Glass can do at least some of the most common tasks for which I get out my iPhone in under five seconds, such as taking a snapshot or doing a Web search.

Closing the gap between intention and action will open up potential computing modalities that do not currently exist, entirely novel use case scenarios that are difficult even to envision before a critical mass of early adopter experience is achieved.

The Technology Review interview from which I extracted the quote raises some of the potential issues wearable tech needs to address, but the value proposition driving adoption will soon be truly compelling.

I'm adding some drill-down links below.

Source: FutureHIT
11:22

Practices tended to use few formal mechanisms, such as formal care teams and designated care or case managers, but there was considerable evidence of use of informal team-based care and care coordination nonetheless. It appears that many of these practices achieved the spirit, if not the letter, of the law in terms of key dimensions of PCMH.

via www.annfammed.org

One bit of good news about the Patient Centered Medical Home (PCMH) model: here is a study showing that in spite of considerable challenges to PCMH implementation, the transformations it embodies can be and are being implemented even in small primary care practices serving disadvantaged populations.

Source: FutureHIT

September 9, 2014

14:08

Hi Peter

We are delighted to introduce our new series of Health Insights. These free-to-attend events for healthcare professionals feature interactive round-table activities, news on how the latest innovations support the health and care community, and best-practice experiences from NHS Trust colleagues.

CLICK HERE TO SEE NEW DATES AND LOCATIONS

Starting in Leeds and Newbury this October and held in association with NHS England, each one day conference will feature:

  • Digital Discovery Sessions: facilitated round tables exploring procurement issues
  • An update from NHS England on Tech Funds and the Open Source Programme
  • Host Roy Lilley, popular healthcare broadcaster, with lively panel debates

Speakers will include Rob Webster, CEO of NHS Confederation, Tim Straughan, Director of Health and Innovation at Leeds and Partners, and Clive Kay, Chief Executive of Bradford Teaching Hospitals.

REGISTER FREE TODAY

We hope to see you at your local Health Insights.

Kind regards

Samantha Phillips
HIMSS UK

Categories: News and Views , All

September 5, 2014

17:53
Press Information   .   Communiqué de presse   .   Comunicado de prensa


Geneva, Switzerland & Copenhagen, Denmark, 1 September 2014

The International Council of Nurses (ICN) and the International Health Terminology Standards Development Organisation (IHTSDO) today announced an updated collaboration agreement to advance terminology harmonisation and foster interoperability in health information systems.  The new collaboration agreement signed today will be reviewed on completion of the work or in April 2016, whichever is earliest.

The overarching goals of this collaboration are to ensure that nurses worldwide have the tools they need to carry out their jobs effectively, that they are not disenfranchised from the global informatics infrastructure, and that they remain active in the collection of meaningful and useful health information.

As part of the collaboration agreement, ICN, owner of the International Classification for Nursing Practice (ICNP), and IHTSDO, owner of SNOMED CT, have agreed to undertake further work that defines the relations between SNOMED CT and ICNP to enable their interoperability in health information systems globally. It builds on work already undertaken to produce an equivalence table for nursing diagnoses.

In the coming years IHTSDO and ICN will focus on two key areas of work: joint publication of a completed equivalence table between SNOMED CT and ICNP for Nursing Diagnoses, and joint publication of a completed equivalence table between SNOMED CT and ICNP for nursing interventions.

“ICN is delighted to extend our collaboration with IHTSDO,” said David Benton, ICN’s Chief Executive Officer.  “This agreement will be of mutual benefit to both organisations as well as to patients and will improve the description and comparison of nursing practice locally, regionally, nationally and internationally.”

“IHTSDO is pleased to be continuing its collaboration with the ICN”, said Jane Millar, Head of Collaboration at IHTSDO. “Our joint work in linking SNOMED CT and ICNP will be of benefit to the nursing profession worldwide to ensure a common understanding and interoperability, and also to support sharing with other members of the healthcare team.”

The first collaborative agreement between the two organisations was signed in 2010, and in January 2014 the cooperation was further advanced by the announcement of an equivalence table between ICNP concepts and SNOMED CT concepts.

Notes to the Editor:

ICN and IHTSDO are the developers of the International Classification for Nursing Practice (ICNP) and SNOMED Clinical Terms (CT), respectively. The ICNP terminology serves a critical role for ICN in representing the domain of nursing practice worldwide, thus providing nurses at all levels with data-based information used for practice, administration, education and research. SNOMED CT is a multidisciplinary healthcare terminology designed to support the entry and retrieval of clinical concepts in electronic record systems and the safe, accurate, and effective exchange of health information.

About ICN:
The International Council of Nurses (ICN) is a federation of more than 130 national nurses associations representing the millions of nurses worldwide. Operated by nurses and leading nursing internationally, ICN works to ensure quality care for all and sound health policies globally. (www.icn.ch).

About IHTSDO:
IHTSDO determines global standards for health terms, an essential part of improving the health of humankind. Its experts work collaboratively with diverse stakeholders to ensure that SNOMED CT, its world-leading terminology product, is accepted around the world as the common language for health (www.ihtsdo.org).

My source: Amy L Amherdt
Categories: News and Views , All

August 30, 2014

7:28
The benefits of information technology across all sectors are well recognised when they are realised:

[Flattened diagram: a humanistic-to-mechanistic axis crossed by an individual-to-group axis, spanning interpersonal, sciences, sociology and political perspectives. At the humanistic/individual end: creativity, usability, readiness to hand, context. In between: information storage and retrieval, access, efficiency, space, security, information sharing, patient safety, legibility, and digital inclusion. At the mechanistic/group end: cost, savings, governance, reporting (locally, nationally, internationally), and policy integration.]



As we head towards a paperless NHS let's not forget that health care is both an art and a science.

Links:

The Digital Challenge (due for an update?)

Digitising the NHS by 2018 - One Year On. techUK report | March 2014.

What of the impact of the pending 2015 election? A time inconsistency problem:
The NHS needs a 'Bank of England moment' HSJ.

drawMD Pediatrics - Patient Education by Drawing on Medical Artwork for Healthcare Providers

Medical-Artist

Categories: News and Views , All

October 14, 2012

20:05


Twitter, like the Internet in general, has become a vast source of and resource for health care information. As with other tools on the Internet it also has the potential for misinformation to be distributed. In some cases this is done by accident by those with the best intentions. In other cases it is done on purpose such as when companies promote their products or services while using false accounts they created.

In order to help determine the credibility of tweets containing health-related content, I suggest using the following checklist (adapted from Rains & Karmikel, 2009):

  1. Author: Does the tweet contain a first and last name? Can this name be verified as being a real person by searching it on the Internet?
  2. Date: When was the tweet sent? If it is a re-tweet, when was the original tweet sent?
  3. Reference: Does the tweet reference a source? Is this source reliable?
  4. Statistics: Does the tweet make claims of effectiveness of a product or service using statistics? Are the statistics used properly?
  5. Personal story or testimonials: Does the tweet contain claims from an individual who has used or conducted research on the product or service? Is this individual credible?
  6. Quotations: Does the tweet quote or cite another source of information (e.g. a link) that can be checked? Is this source credible?
Ultimately it is up to the individual to determine how to use health information they find on Twitter or other Internet sources. For patients anecdotal or experiential information shared by others with the same illness may be considered very credible. Others conducting research may find this a less valuable information source. Conversely a researcher may only be looking for tweets that contain reference to peer-reviewed journal articles whereas patients and their caregivers may have little or no interest in this type of resource.

Reference

Rains, S. A., & Karmikel, C. D. (2009). Health information-seeking and perceptions of website credibility: Examining Web-use orientation, message characteristics, and structural features of websites. Computers in Human Behavior, 25(2), 544-553.

June 26, 2012

14:35

The altmetric movement is intended to develop new measures of production and contribution in academia. The following article provides a primer for research scholars on what metrics they should consider collecting when participating in various forms of social media.

Twitter

ThinkUp

If you participate on Twitter you should be keeping track of the number of tweets you send, how many times your tweets are replied to, re-tweeted by other users and how many @mentions (tweets that include your Twitter handle) you obtain. ThinkUp is an open source application that allows you to track these metrics as well as other social media tools such as Facebook and Google +. Please read my extensive review about this tool. This service is free.

Bit.ly

You should register with a URL-shortening service such as bit.ly, which will provide you with an API key that you can enter into applications you use to share links. This will provide a means to keep track of your click-through statistics in one location. Bit.ly records how many times a link you created was clicked on, along with the referrer and location of the user. Consider registering your own domain name and using it to shorten your tweets as a means of branding. In addition, you can use your custom link on electronic copies of your CV or at your own web site, which will inform you when your links have been clicked on. You should also consider using bit.ly to create the links used at your web site, providing you with feedback on which are used most often. For example, all of the links in this article were created using my custom bit.ly domain. In addition, you can tweet a link to any research study you publish to publicize it as well as keep track of how many clicks are obtained. Bit.ly is a free service.

TweetReach

Another tool to measure your tweets is TweetReach. This service allows you to track the reach of your tweets by Twitter handle or tweet. It provides output in formats that can be saved for use elsewhere (Excel, PDF or the option to print or save your output by link). To use these latter features you must sign up for an account but the service is free.

Buffer

Buffer is a tool that allows you to schedule your tweets in advance. You can also connect Buffer to your bit.ly account so links used can be included in your overall analytics. Although Buffer provides its own measures on click-through counts this can contradict what appears in bit.ly. This service is free but also has paid upgrade options available that provide more detailed analytics.

Web presence

Google Scholar Citation Profile

You can set up a profile with Google Scholar based on your publication record. The metrics provided by this service include a citation count, h-index and i10-index. When someone searches your name using Google Scholar your profile will appear at the top before any of the citations. This provides a quick way to separate your articles from someone else who has the same name as you.

Google Feedburner for RSS feeds

If you maintain your own web site and use RSS feeds to announce new postings, you can also collect statistics on how many times your articles are clicked on. Feedburner, recently acquired by Google, provides one way to measure this. You enter your RSS feed URL and a report is generated, which can be saved in CSV format.

Journal article download statistics

Many journals provide statistics on the number of downloads of articles. Keep track of those associated with your publication by visiting the site. For example, BioMed Central (BMC) maintains an access count of the last 30 days, one year and all time for each of your publications.

Quora

Other means of contributing to the knowledge base in your field include participating on web-based forums or web sites such as Quora. Quora provides threaded discussions on topics and allows participants to both generate and respond to the question. Other users vote on your responses and points are accrued. If you want another user to answer your question you must “spend” some of your points. Providing a link to your public profile on Quora on your CV will demonstrate another form of contribution to your field.

Paper.li

Paper.li is a free service that curates content and renders it in a web-based format. The focus of my Paper.li is the use of technology in Canadian healthcare. I have also created a page that appears at my web site. Metrics on the number of times your paper has been shared via Facebook, Twitter, Google+ and LinkedIn are available. This service is free.

Twylah

Twylah is similar to paper.li in that it takes content and displays it in a newspaper format except it uses your Twitter feed. There is an option to create a personalized page. I use tweets.lauraogrady.ca. I also have a Twylah widget at my web site that shows my trending tweets in a condensed magazine layout. It appears in the side bar. This free service does not yet provide metrics but can help increase your tweet reach. If you create a custom link for your Twylah page you can keep track of how many people visit it.

Analytics for your web site

Log file analysis

If you maintain your own web site you can use a variety of tools to capture and analyze its use. One of the most popular applications is Google Analytics. If you are using a content management system such as WordPress there are many plug-ins that will add the code to the pages at your site and produce reports. WordPress also provides a built-in analytic available through its dashboard.

If you have access to the raw log files you could use a shareware log file program or the open source tool Piwik. These tools will provide summaries about what pages of your site are visited most frequently, what countries the visitors come from, how long visitors remain at your site and what search terms are used to reach your site.

Summary

All of this information should be included in the annual report you prepare for your department and your tenure application. This will increase awareness of altmetrics and improve our ability to have these efforts “count” as contributions in your field.

June 24, 2012

12:52
The following provides a timeline of articles that appeared in newspapers and blogs from January 2011 to the present. The articles demonstrate a progression from patient engagement in online communities to those that include reference to increasing provider involvement.

  • January 5th, 2011
  • February 3rd, 2011
  • February 22nd, 2011
  • March 23rd, 2011
  • April 2nd, 2011
  • April 25th, 2011
  • May 14th, 2011
