
Samara Rosenfeld 


A new machine learning algorithm was highly accurate in determining whether a patient is likely to have a cholesterol-raising genetic disease that can cause early heart problems, according to the results of a study conducted by researchers at the Stanford University School of Medicine.

The algorithm was 88 percent accurate in identifying familial hypercholesterolemia (FH) in one data sample and 85 percent accurate in another.

In the study published in npj Digital Medicine, Joshua Knowles, M.D., Ph.D., assistant professor of cardiovascular medicine at Stanford, and his research team created an algorithm using data from Stanford’s FH clinic to learn what distinguishes an FH patient in an electronic health record (EHR).

The algorithm was trained to pick up on a combination of family history, current prescriptions, lipid levels, lab tests and more to understand what signals the disease.

The foundation of the algorithm was built using data from 197 patients who had FH and 6,590 patients who did not, so the program could learn the difference between positive and negative results.

Once the algorithm was trained, the research team ran it on a set of roughly 70,000 new de-identified patient records. The team reviewed 100 charts from the flagged patients and found that the algorithm had detected patients with FH with 88 percent accuracy.
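The paper's pipeline is not described in code in the article, but the basic workflow it implies – train a classifier on labeled EHR features, then score a new cohort and flag the most suspicious records for chart review – can be sketched roughly as follows. The synthetic data, feature count and choice of a random forest are illustrative assumptions, not details of the Stanford model.

```python
# Illustrative sketch only: the synthetic features and the random-forest model
# below are assumptions for demonstration, not the Stanford pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the training cohort: ~197 FH cases vs. ~6,590 controls,
# each row summarizing EHR signals (family history, lipid labs, prescriptions...).
X, y = make_classification(n_samples=6787, n_features=40,
                           weights=[6590 / 6787], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" compensates for the roughly 1:33 class imbalance
clf = RandomForestClassifier(n_estimators=500, class_weight="balanced", random_state=0)
clf.fit(X_train, y_train)

# Score a new, de-identified cohort and flag the highest-scoring records for chart review
new_cohort = np.random.rand(70_000, 40)          # placeholder for ~70,000 new records
scores = clf.predict_proba(new_cohort)[:, 1]
flagged = np.argsort(scores)[-100:]              # e.g., the 100 charts reviewed manually
print(f"{len(flagged)} patients flagged for manual chart review")
```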

Knowles and his partner, Nigam Shah, MBBS, Ph.D., associate professor of medicine and biomedical data science at Stanford, collaborated with Geisinger Health System to further test the algorithm.

The algorithm was tested on 466 patients with FH and 5,000 patients without FH, and the predictions came back with 85 percent accuracy.

Shah said that he and Knowles knew that many of the Geisinger patients had an FH diagnosis confirmed by genetic sequencing.

“So that’s how we convinced ourselves that yes, this indeed works,” he said.

FH is an underdiagnosed genetic condition that leads to an increased risk of coronary artery disease if untreated. A patient with FH faces 10 times the risk of heart disease of someone with normal cholesterol. The condition can lead to a heart attack or death, and there are clear benefits to timely management, yet it is estimated that fewer than 10 percent of people with FH in the U.S. have been diagnosed.

Early diagnosis and treatment of FH can neutralize the threat of the condition. And one diagnosis could help multiple people because FH is genetic, making it likely that other relatives have it too.

Lead author Juan Banda, Ph.D., a former research scientist at Stanford, wrote that when the algorithm is applied broadly to screen for FH, it is possible to identify thousands of undiagnosed patients with the condition. This could lead to more effective therapy and screening of their families, Banda wrote.


Mike Miliard


From brain-computer interfaces to nanorobotics, a new report from Frost & Sullivan explores leading edge developments and disruptive tech.


A new study from Frost & Sullivan takes stock of some of the rapid-fire developments in the world of patient monitoring, which is expanding its capabilities by leaps and bounds with the maturation of sensors, artificial intelligence and predictive analytics.

WHY IT MATTERS
"Patient monitoring has evolved from ad hoc to continuous monitoring of multiple parameters, causing a surge in the amount of unprocessed and unorganized data available to clinicians for decision-making," according to F&S researchers. "To extract actionable information from this data, healthcare providers are turning to big data analytics and other analysis solutions.

The ability of such analytics to both assess patients in the moment and point toward their potential future condition had health systems investing more than $566 million in the technology during 2018, the report notes.

But data-crunching is only the beginning of what hospitals and healthcare providers will need to be prepared to manage in the years ahead if they hope to take full advantage of fast-evolving patient monitoring technology.

Wearables and embedded biosensors – such as continuous glucose monitors, blood pressure monitors, pulse oximeters and ECG monitors – are an obvious place to start, as health systems look to manage chronic conditions and population health, both in and out of the hospital.

But there are many more advances already starting to gain traction, such as smart prosthetics and smart implants. "These are crucial for patient management post-surgery or rehabilitation," researchers said, as "they help in measuring the key parameters to support monitoring and early intervention to avoid readmission or complexities."

Other innovations set for big growth are digital pills and nanorobots, which can help monitor medication adherence. In addition, advanced materials and smart fabrics are opening new frontiers in wound management and cardiac monitoring, the report notes. And brain-computer interfaces can enable direct monitoring and measurement of key health metrics to assess patients' psychological, emotional and cognitive state.

THE LARGER TREND
In a recent interview with Healthcare IT News, digital health pioneer Dr. Eric Topol, founder and director of Scripps Research Translational Institute, was asked which developments in AI and mobile technology he thought would be most transformative in the year ahead.

"Longer term, the biggest thing of all is remote monitoring and getting rid of hospital rooms," said Topol. "And there, the opportunity is extraordinary. Because obviously the main cost in healthcare is personnel. And if you don't have hospital rooms, you have a whole lot less personnel. So setting up surveillance centers with remote monitoring – which can be exquisite and very inexpensive with the right algorithms, when it's validated – would be the biggest single way to improve things for patients, because they're in the comfort of their own home"

The value of patient monitoring is recognized at the federal level too. Centers for Medicare and Medicaid Services Administrator Seema Verma has called for expansion of reimbursement for remote care, with CMS seeking to "make sure home health agencies can leverage innovation to provide state-of-the-art care," she said.

ON THE RECORD
"In the future, patient monitoring data will be combined with concurrent streams from numerous other sensors, as almost every life function will be monitored and its data captured and stored," said said Sowmya Rajagopalan, global director of Frost & Sullivan's Advanced Medical Technologies division. "The data explosion can be harnessed and employed through technologies such as Artificial Intelligence (AI), machine learning, etc., to deliver targeted, outcome-based therapies."

Rajagopalan added that, "as mHealth rapidly gains traction, wearables, telehealth, social media and patient engagement are expected to find adoption among more than half of the population in developed economies by 2025. The patient monitoring market is expected to be worth more than $350 billion by 2025, as the focus is likely to move beyond device sales to solutions."


Samara Rosenfeld


Machine learning algorithms using administrative data can be valuable and feasible tools for more accurately identifying opioid overdose risk, according to a new study published in JAMA Network Open. 

Wei-Hsuan Lo-Ciganic, Ph.D., of the University of Florida College of Pharmacy in Gainesville, along with her research team, found that machine learning algorithms performed well for risk prediction and stratification of opioid overdose — especially in identifying low-risk subgroups with minimal risk of overdose.

Lo-Ciganic told Inside Digital Health™ that machine learning algorithms outperformed the traditional approach because the algorithms take into account more complex interactions and can identify hidden relationships that traditionally go unseen.

The researchers used prescription drug and medical claims for a 5 percent random sample of Medicare beneficiaries between January 2011 and December 2015. The team identified fee-for-service adult beneficiaries without cancer who were U.S. residents and received at least one opioid prescription during the study period.

The team compiled 268 opioid overdose predictor candidates, including total and mean daily morphine milligram equivalent, cumulative and continuous duration of opioid use and total number of opioid prescriptions overall and by active ingredient.

The cohort was randomly and equally divided into training, testing and validation samples. Prediction algorithms were developed and tested for opioid overdose using five commonly used machine-learning approaches: multivariate logistic regression, least absolute shrinkage and selection operator-type regression, random forest, gradient boosting machine and deep neural network.

Prediction performance was compared with the 2019 Centers for Medicare and Medicaid Services opioid safety measures, which are meant to identify high-risk individuals and opioid use behavior in Medicare recipients.

To determine the extent to which patients predicted to be high risk exhibited higher overdose rates compared with those predicted to be low risk, the researchers compared the C-statistics and precision-recall curves across the different methods using the DeLong test.

Low-risk patients had a predicted score below the optimized threshold, medium-risk patients had a score between the threshold and the top 10th percentile, and high-risk patients had scores in the top 10th percentile.
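To make the evaluation and stratification steps concrete, here is a rough sketch using synthetic data and scikit-learn's gradient boosting as a stand-in; the placeholder threshold, feature counts and model settings are assumptions for illustration, and the study's tuning and DeLong comparisons are not reproduced.

```python
# Illustrative sketch only: synthetic data, an untuned GradientBoostingClassifier
# and a placeholder threshold stand in for the study's actual pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the claims-derived predictors (the study used 268 candidates)
X, y = make_classification(n_samples=20_000, n_features=50, weights=[0.99], random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.33,
                                                      stratify=y, random_state=0)

gbm = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
scores = gbm.predict_proba(X_valid)[:, 1]
print("C-statistic:", roc_auc_score(y_valid, scores))   # the study reported ~0.90 for its GBM

# Stratify: high risk = top 10th percentile of predicted scores,
# low risk = below an optimized threshold, medium risk = in between.
optimized_threshold = 0.05             # placeholder; the study derives this from training data
top_decile_cutoff = np.percentile(scores, 90)
risk_tier = np.where(scores >= top_decile_cutoff, "high",
                     np.where(scores < optimized_threshold, "low", "medium"))
```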

Based on the findings, the deep neural network and gradient boosting machine performed the best, with the deep neural network having a C-statistic of 0.91 and the gradient boosting machine having a C-statistic of 0.90.

With the gradient boosting machine algorithm, 77.6 percent of the sample was categorized as low risk, while 11.4 percent was medium risk and 11 percent high risk. And with the deep neural network algorithm, 76.2 percent of people were predicted to be low risk, and 99.99 percent of those individuals did not have an overdose.

Lo-Ciganic said that with the promising results of the study, the next step would be to develop software to be incorporated into health systems — or an electronic health record — to see if the algorithms can be applied in real-world settings to help clinicians identify high-risk individuals.

Erin Dietsche


Peter Bak, CIO of Humber River Hospital in Toronto, highlighted his organization's digital efforts around systems automation, connectivity and an analytics-focused command center.

Humber River Hospital, which is situated in Toronto, Ontario, is eagerly integrating IT into various facets of its operations. In a phone interview, CIO Peter Bak highlighted a bit of the work the organization has done.

“Hospitals are generally not organizations that adopt change dramatically,” Bak said. In fact, most healthcare organizations struggle with it.

But that’s where he comes in. His role has “morphed into helping define a culture of innovation” at Humber.

In addition to having its information in electronic form, the hospital has enabled digital patient engagement. Humber also utilizes systems automation, which can result in safer and more efficient workflows. One example Bak cited is the use of robotic devices that can move around the hospital and deliver supplies.

Plus, the hospital emphasizes connectivity. This includes linking people to assets. “People need to find things around the hospital,” Bak said. Humber has used IT to help hospital staff find wheelchairs or other needed supplies.

The connectivity front also includes bridging the gap between care teams. Tools from telecommunications company Ascom have come in handy here. Humber uses Ascom’s platform to improve person-to-person communication and person-to-system communication. Staff members can use the Ascom solutions to talk to each other, and the platform also ensures certain alerts and alarms are going to the right clinicians.

Humber River Hospital’s latest development involves a command center it implemented a little more than a year ago. The point of it, Bak said, is to leverage electronic data to provide the hospital and its care teams with analytics and information. Regarding the center, Humber is currently working on analytics as it relates to eliminating never events.

At the end of the day, the Canadian hospital’s efforts tie back to giving patients the best outcomes and quality of care.

“I watch how we communicate in the consumer world, and yet in healthcare, we are not using these technologies,” Bak said. “[W]e’re all languishing in archaic methods of communicating. That leads to bad outcomes for patients.”



Shane Whitlatch 


Healthcare IT systems are becoming increasingly convoluted. More data, more connected devices and more regulations require more systems, which ideally can communicate and exchange data — not just within a healthcare organization but among organizations. This is the idea behind healthcare interoperability. According to HIMSS, interoperability is about the extent to which systems and devices can exchange data and interpret that shared data. For two systems to be interoperable, they must be able to exchange data and present that data so that a user understands the information and can use it in their treatment and operations decisions.

HIMSS goes on to describe three progressive levels of health IT interoperability. First is “foundational” interoperability. It enables one IT system to receive a data exchange from another and does not require the ability for the receiving information technology system to interpret the data.

The next step up is “structural” interoperability. It determines the structure or format of the data exchange (i.e., the message format standards) where there is uniform movement of healthcare data from one system to another. Structural interoperability ensures that data exchanges between information technology systems can be interpreted at the data field level.

The final and highest level is “semantic” interoperability. In this situation, two or more systems or elements can exchange and use information. Semantic interoperability takes advantage of both the data exchange structure and the codification of the data. This level of interoperability supports the electronic exchange of patient summary information among caregivers and other authorized parties via potentially disconnected electronic health record (EHR) systems and other networks to improve quality, safety, efficiency and efficacy of healthcare delivery.

Interoperability becomes optimal when it includes data processing and interpretation, with the goal of delivering actionable information to the end user, such as clinicians and the patients themselves.
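As a concrete (and much simplified) illustration of the semantic level, the snippet below represents a lab result as a FHIR-style payload carrying a standard LOINC code, which a receiving system can interpret without caring which EHR produced it. The structure is abbreviated and the lookup table is hypothetical; this is not a complete FHIR implementation.

```python
# Simplified, FHIR-like observation (not a complete FHIR resource); the lookup
# table below is a hypothetical illustration of code-based interpretation.
observation = {
    "resourceType": "Observation",
    "code": {"system": "http://loinc.org", "code": "2093-3",   # LOINC code for total cholesterol
             "display": "Cholesterol [Mass/volume] in Serum or Plasma"},
    "valueQuantity": {"value": 242, "unit": "mg/dL"},
    "subject": {"reference": "Patient/example-123"},
}

# Foundational: the bytes arrive. Structural: the receiver can parse the fields.
# Semantic: the receiver knows what code 2093-3 means and can act on it,
# regardless of which EHR sent the message.
KNOWN_CODES = {"2093-3": "total_cholesterol"}

def interpret(obs):
    concept = KNOWN_CODES.get(obs["code"]["code"], "unknown")
    qty = obs["valueQuantity"]
    return concept, qty["value"], qty["unit"]

print(interpret(observation))   # ('total_cholesterol', 242, 'mg/dL')
```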


Benefits of Interoperability

The premise of interoperability is to improve patient care and data safety. Other goals include improved care coordination and experiences for patients, lowered healthcare costs and more robust public health data.

But how does interoperability accomplish these objectives? Here are the five key benefits of healthcare system interoperability through better information data exchange:


1. Greater patient safety

In this day and age, medical errors should be rare. A Johns Hopkins study determined that 44 percent of medical error deaths were preventable. By creating and implementing advanced interoperability, with the aim to capture and interpret data across systems and applications, healthcare organizations can better prevent errors due to missing or incomplete patient data. If an error does occur, advanced interoperability enables healthcare organizations to pinpoint the cause.

Healthcare providers might not be able to exchange data with external affiliates and systems even if they have excellent interoperability within their own enterprise. Lacking data on a patient’s vital signs and history — including allergies, medications or pre-existing conditions — healthcare organizations may be prone to fatal errors.

However, if these organizations can exchange and examine data, care providers can analyze the exact cause of a medical error to detect trends in the decision making leading up to the error. Once a pattern has been identified, healthcare organizations can begin remediating these issues to prevent future errors.


2. Improved patient experiences and coordination of care

The healthcare industry provides a stunning example of inefficiency in today’s digital world. The multiple providers who may be caring for a patient do not have their care coordinated. Patients must often do administrative tasks like search for documents, fill out multiple forms, re-explain their symptoms or medical history and sort out insurance (both before and often after receiving care). In fact, the Office of the National Coordinator for Health IT revealed research that suggests only 46 percent of hospitals had electronic access at the point of care to the patient information they required from outside providers or sources.

Interoperability can vastly improve this process, giving patients faster, more accurate and coordinated treatment and enhancing their overall experience.


3. More accurate public health data

Where interoperability is present, IT systems can interact in such a way that faster and more accurate collection and interpretation of public health data are possible. This can help organizations answer pressing questions for both patients and providers. The opioid crisis provides an excellent example of why healthcare needs more robust public health data to understand the scope of that problem and to find ways to more effectively address and resolve the crisis. By facilitating the sharing and interpretation of such data, interoperability allows healthcare organizations to collectively educate one another on predicting and preventing outbreaks.


4. Reduced costs and higher productivity

Improved care and hospital safety are outcomes of system interoperability. This ability to exchange data could save the U.S. healthcare system more than $30 billion a year, according to an estimate from the West Health Institute (WHI), which recently testified before Congress. Interoperability also gives organizations the opportunity to save time with every patient encounter by getting the right data to the patient, the provider and affiliate at the right time, every time.


5. Better patient privacy and security

Patient privacy and security are the primary clinical and regulatory issues to consider when implementing interoperability. This is not an easy task, but it can help enhance the privacy and security of patient data by requiring organizations to fully assess where their protected health information (PHI) resides and with whom it needs to be shared. When organizations enter data into systems that cannot communicate with one another, for example, it becomes difficult to track all systems that touch PHI, as required by the HIPAA Security Rule. It can be even tougher to track users with access to an EHR or affiliated applications: In a study of 1 million FairWarning users, 26 percent of users were found to be poorly known or unknown to the care provider.

By promoting the interoperability of human resource management systems such as Lawson or Peoplesoft with your EHR, though, you can better identify users, track their access and more effectively manage access rights. When PHI is entered into secure, interoperable systems, organizations gain a better idea of where their data reside and who has access to them, helping them secure patient data and protect privacy.


Worth the Effort

The American Hospital Association, the Association of American Medical Colleges and several other organizations released a report in January that called for interoperability, arguing that it gives patients peace of mind because they know their providers’ decisions are based on the best, most complete information possible. Interoperability could form the foundation for a significant improvement in both patient care and patient experience. Healthcare processes would become streamlined. It takes work to achieve true healthcare systems interoperability, but it’s a worthwhile undertaking.

These providers have created diverse models of care that incorporate use of patient-centered technologies with measurable outcomes.

Jeff Lagasse


Hospitals and health systems across the U.S. are seeking ways to better engage patients with a variety of handheld and home-based technologies to improve patient experience and health outcomes.

This raises the questions: How does one use technology to transform the hospital bedside? Or to increase medication adherence for hypertension? What about controlling diabetes, or reducing distress in patients with cancer?

Four healthcare organizations – UC San Diego Health, Ochsner Health System, Sutter Health and Stanford Health Care – have developed answers to these questions. All have created diverse models of care that incorporate use of patient-centered technologies with measurable outcomes, and these results were recently published in Health Affairs.

IMPACT

Here are a few examples of these projects and the effects they've had.

Ochsner Health System used its online patient portal to help treat hypertension with a new digital medicine program that combined patient-reported blood pressure data, clinical data and coaching.

Outcomes showed that medication adherence among patients improved 14 percent, while 79 percent achieved greater blood pressure control. Overall, clinicians saw a 29 percent reduction in clinic visits.

Sutter Health used its patient portal to help patients self-manage their diabetes. Online reminders of hemoglobin A1c monitoring among patients with diabetes improved the rate of A1c test completion by 33.9 percent. Overall, patients with previously uncontrolled diabetes had a significant reduction in HbA1c at six months compared to usual care.

Stanford Health Care, meanwhile, used its patient portal to help patients with cancer manage stress. Patients were surveyed before clinic visits to identify unaddressed symptoms, and about 40 percent of those who responded reported experiencing distress. These responses led to more than 6,000 referrals for psychotherapy, nutrition and other services.

WHAT ELSE YOU SHOULD KNOW

In 2016, UC San Diego Health opened Jacobs Medical Center, a 245-bed hospital that offers advanced surgery, cancer care, cardiac rehabilitation, and birthing options. To put patients in direct control of their experience, an Apple tablet was placed in every patient room.

The tablets enabled patients to control room temperature, lighting and entertainment options from their beds. The tablets also enabled access to personal medical information, such as test results and schedules of medications or upcoming procedures. Photographs and biographies of their care team were also available.

Researchers found that a large share of patients said the tablets contributed to a positive patient experience, and that engagement in medical care, measured by how often patients accessed their medical records, was higher.

THE TREND

Consumerism means customers have expectations of convenience, flexibility and ease of use. In healthcare, the customers are the patients, and healthcare organizations who want to attract business are increasingly looking to technological innovation to bring patients into the fold.

Aditya Bhasin, chief of web systems and vice president of software design and development at Stanford Health Care, spoke at HIMSS19 in February about his organization's attempts to do just that, emphasizing the importance of innovating from the inside -- to better create something that fits a provider's specific ecosystem.

Physician informaticists adjusted workflow, optimized EHR documentation and built new tools to improve depression screening, diagnosis and management.


Mike Miliard


UCLA Health has earned a HIMSS Davies Award for its comprehensive efforts to rethink the way it uses information and technology to screen for depression in primary care patients.

WHY IT MATTERS
In addition to its work on depression, UCLA also was recognized for other innovative applications of IT. It made adjustments to its electronic health record to save $2 million each year in denials, for instance, and improved appropriate red blood cell utilization thanks to a collaborative project among its hospitalists, nurses, transfusion staff and the IT department.

But it's the new approach to depression screening that won particular attention and could impact the most patients. More than 300 million people suffer from depression; each year 26 percent of adults and 20 percent of children have a diagnosable behavioral health disorder, as HIMSS points out.

And with more Americans getting behavioral health care from their PCPs than from mental health specialists, many cases of depression remain undetected in primary care settings – including at UCLA Health, which recognized a suboptimal depression screening rate among adults seen in its primary care offices.

Among the many ways the health system addressed this was having its physician informaticists work alongside its IT and operational teams to review and revise existing EHR workflows for depression risk screening – including the deployment of new tools, such as an online platform, "Behavioral Health Check-up," developed by the UCLA Division of Population Behavioral Health.

Building on the insights from a series of plan-do-study-act cycles, UCLA rolled out new clinical workflows, created new training programs and continuously monitored how the new tools and strategies were working. This led to a better universal depression screening rate; more accurate diagnosis and management of depression comorbidities; increased referrals to appropriate specialists; more discrete and trackable data; and improved risk-adjusted coding and appropriate charge capture, officials said.

Even better, "the depression risk screening rate at UCLA Health increased four-fold within one year, with more than 70 percent of adult patients screened for depression in the primary care setting and more than 90 percent rate of completion of additional follow-up diagnostic evaluation when patients presented positive depression risk," according to HIMSS. "Additionally, UCLA Health is anticipating to recuperate over $80,000 of revenue as a result of the project implementation."

THE LARGER TREND
The HIMSS Davies Award of Excellence recognizes high-achieving health organizations that harness health information and technology to boost patient outcomes and value. Other winners this past year include Hospital for Special Surgery, California Correctional Health Care Services, Banner Health, Open Door Family Medical Centers, TriHealth and Mercy Health.

ON THE RECORD
"Behavioral health is a critical component for risk-adjusting patients to ensure they are receiving the proper level of care," said Jonathan French, senior director of quality and patient safety initiatives at HIMSS. "UCLA Health has leveraged technology to ensure that its providers are receiving a clearer picture of their patient's mental health, which allows them to intervene more effectively and improve the overall health of their patients. For this, HIMSS is proud to recognize UCLA Health as a Davies Enterprise Award recipient."

"Because UCLA Health continuously seeks to optimize patient care using health information technology, the HIMSS Nicholas E. Davies Award of Excellence provided us an opportunity to recognize and highlight some of our recent successes," said Kevin Baldwin, informatics portfolio manager, UCLA Health. "We're thankful for the experience and look forward to continue applying health information technology to enhance our health care system."

With fewer than half of HHC clinicians having the ability to view patient data in an EHR, new research finds giving them the option could reduce medical errors. 

Nathan Eddy



Major gaps in communication exist between hospital and home health care (HHC) clinicians, which could lead to potentially deadly medical errors, a University of Colorado Anschutz Medical Campus study found.


WHY IT MATTERS


The study concluded that providing electronic health record access for HHC clinicians would be a promising solution to improve the quality of communication.


Although almost all respondents (96 percent) indicated that internet-based access to a patient's hospital record would be at least somewhat useful, fewer than half reported having access to EHRs for referring hospitals or clinics.


Among respondents of the study, which surveyed 50 HHC nurses, managers, administrators and quality assurance clinicians, 60 percent reported receiving insufficient information to guide patient management in HHC, and 44 percent reported encountering problems related to inadequate patient information.


More than half of respondents (52 percent) indicated patient preparation to receive HHC was inadequate, with patient expectations frequently including extended-hours caregiving, housekeeping and transportation – services beyond the scope of HHC.


Respondents with EHR access for referring providers were less likely to encounter problems related to a lack of information (27 percent versus 57 percent without EHR access).


"We have heard of medication errors occurring between hospitals and home health care providers," the study's lead author Christine Jones, an assistant professor at the University of Colorado, said in a statement. "As a result, patients can receive the wrong medication or the wrong dose. Some home health providers don't get accurate information about how long to leave a urinary catheter or intravenous line in."



Nearly six in 10 respondents (58 percent) said the recommendation of additional tests by hospital clinicians was the communication domain most frequently identified as insufficient.



Jones also noted that additional studies have found extremely high rates of medication discrepancies (94 percent to 100 percent) when referring-provider and HHC medication lists are compared, adding that if these issues are arising in Colorado, they could signify a national problem.


ON THE RECORD


"For hospitals and HHC agencies seeking strategies to improve communication, this study can provide targets for improvement," Jones said. "Future interventions to improve communication between the hospital and HHC should aim to improve preparation of patients and caregivers to ensure they know what to expect from HHC and to provide access to EHR information for HHC agencies.”


The study, published in the Journal of the American Medical Directors Association, suggested targeted education of hospital staff about what home health clinicians actually provide to patients and caregivers to avoid frustration.


Just 12 percent of respondents reported positive experiences when accessing information about hospital admissions from the Colorado Regional Health Information Organization (CORHIO).



Bill Siwicki


Chicago's Rush University has earned the HIMSS Davies Award of Excellence for its work using health IT to treat veterans with post-traumatic stress disorder, more commonly known as PTSD.

The provider organization recognized that approximately 23 percent of U.S. veterans who served in Afghanistan and Iraq suffer from PTSD. Despite the availability of effective evidence-based treatments for PTSD, research has suggested that less than 20 percent of these veterans actually receive these interventions, HIMSS reported.

Furthermore, of the veterans who receive evidence-based PTSD treatments, close to 40 percent do not complete them and therefore do not receive adequate therapeutic doses. Consequently, it is important to identify ways to increase access to evidence-based PTSD treatments and to help veterans stay engaged in treatment so that they can complete their course of therapy, HIMSS said.

To address these critical needs, the "Road Home Program: Center for Veterans and Their Families" at Rush University Medical Center developed an intensive treatment program for veterans with PTSD. The three-week-long program offers a combination of evidence-based PTSD treatments and adjunctive services. Rush is one of the first health systems in the country to offer intensive PTSD treatment, so it was important to ensure that this novel PTSD treatment delivery method was effective, HIMSS explained.

Innovating with clinical data

The Road Home Program worked to address veterans not receiving adequate amounts of therapy by systematically capturing clinical data, including but not limited to PTSD and depression symptom severity at various time points.

This was performed while using existing technology available through the electronic medical records and survey tools. All data capture tools were designed with input from clinicians, researchers and system administrators to ensure that the collection of program data could be completed in short amounts of time to minimize any potential burden on clinicians. Moreover, the systems were designed so that captured data could be easily extracted and analyzed to assist with program evaluation, HIMSS said.

As a result, the Road Home Program was able to improve access to evidence-based treatments for veterans with PTSD. Ongoing, data-driven program evaluation led to continuous improvements in program effectiveness.

According to research published by Road Home clinicians, clinical outcomes from the three-week-long intensive program demonstrate that the intensive program is highly effective and that participation in the program leads to large reductions in PTSD symptoms. In addition, program completion rates are much higher (91.5 percent) compared to standard outpatient PTSD treatment. Veterans also report very high satisfaction with the program and would recommend it to their veteran peers, HIMSS reported.

The organization was able to standardize the Road Home Program data-capture system and share it with other academic medical centers that offer similar programs for veterans with PTSD. As a result, PTSD programs and clinical outcomes can be directly compared to ensure that the veterans served receive the highest quality care possible, HIMSS explained.

Since it first began to offer intensive treatment services in 2015, the Road Home Program has closely tracked clinical outcomes and patient satisfaction through custom flowsheets in the electronic medical records and external survey tools. By continuously analyzing its program-based data – such as veterans' PTSD symptom improvement over the course of the program and at short, medium and long-term follow-up time points – the Road Home Program has been able to make changes to further increase effectiveness, HIMSS said.

The Davies Award

The HIMSS Davies Award of Excellence recognizes outstanding achievement of organizations that have used health information technology to substantially improve patient outcomes and value. The HIMSS Davies Award of Excellence is the pinnacle of the HIMSS Value Recognition Program and highlights organizations promoting health information and technology-enabled improvements in patient and business outcomes through sharing evidence-driven best practices on implementation strategies, workflow design, change management and patient engagement.

"Rush is proud to receive the HIMSS Davies Award of Excellence, as striving for excellence is the bedrock of everything we do at Rush," said Dr. Larry Goodman, CEO of Rush University Medical Center and the Rush System. "While this award symbolizes information technology achievement and expertise, we are especially proud that it also reflects the collective efforts by clinical, operational, and business teams to drive technology-enabled improvements for our patients."



Jeff Lagasse


How an interdisciplinary kaizen group within CDC is charting a roadmap for future metrics to improve population health and provider satisfaction.


Addressing global infectious diseases has been an ongoing challenge. To tackle the issue, in 2018 the U.S. Department of Health and Human Services’ Centers for Disease Control and Prevention put together a Kaizen group consisting of an interdisciplinary collection of healthcare and IT professionals.

The group worked collaboratively to develop a roadmap and metrics for the future of clinical guidelines as they apply to electronic health records and infectious diseases.

Such an approach has several advantages — and a handful of drawbacks. That’s what attracted the attention of Steph Hoelscher, chief clinical analyst for the Office of Clinical Transformation at Texas Tech University Health Sciences Center’s School of Medicine in Lubbock, Texas.

Digitalizing these guidelines and algorithms means creating them in a form that an EHR can accept quickly from an outside source, with minimal modification needed to the system.

“The goal of this would be to decrease guideline adoption time as well as improve both provider and informaticist satisfaction, not to mention improve overall population health,” said Hoelscher. “The process is still in its early stages and hopefully will move to larger scale testing within the next year.”

For the project, Hoelscher and her team looked to align their facilities with the CDC’s initiative, the Quadruple Aim, as well as the 21st Century Cures Act, with regard to clinician documentation burden.

She’ll discuss the implementation in more depth at the upcoming HIMSS19 annual conference in Orlando, Florida -- focusing on preparing an EHR for the future of clinical decision support, and bridging the gap until they get there.

“The process can be as complicated or simple as your development team allows for,” said Hoelscher. “Proper CDS development takes time, patience, evidence, subject matter experts committed to the project, and executive support.”

Facilities often lack the time and resources to properly develop a new process -- one that involves testing, reevaluation and maintenance. But for it to truly succeed, it has to be designed to stand the test of time, said Hoelscher.

That takes commitment, and with the constant changes of both medicine and technology, having CDS design that’s evidence-based, usable and safe can be a challenge.

“If you push hard for a strong CDS foundation, maintenance later on can be made much simpler,” said Hoelscher.

There are, of course, both pros and cons of clinical decision support in EHRs. First, the cons.

The limits of current technology and education are a big one. As fast as technology often moves, sometimes it’s just not fast enough; EHRs are often just not ready for the changes an organization may want to make today, and there have to be temporary bridges built in order to make it across the chasm.

And then there’s making a complex concept understandable to multiple levels of healthcare professionals.

“As with any maintenance cycle of a CDS project, quality education and often re-education needs to be a top priority,” said Hoelscher. “Staff and providers that do not ‘understand’ changes or new workflows, often succumb to frustration, and that’s what we are trying to avoid.”

Yet there are some pros as well. Hoelscher’s organization has integrated the potential for local disease detection into its EHR. With diseases like measles popping up with some frequency as of late, it’s not enough to simply concentrate on Ebola and Zika.

“With that being said, an improved CDS process can possibly help you recognize the next Virus ‘X’ as well,” she said. “We are at the point where it’s not a matter of if, but when. If our systems can be enhanced enough to accept digitized algorithms from agencies such as the CDC in the future, the improvement in quicker detection and treatment of impacted patients could be profound, even life-saving.”

Hoelscher will share these thoughts and more at HIMSS19 annual conference in Orlando in a session entitled “Clinician Satisfaction: Digitalizing ID Clinical Guidelines,” at 3 p.m. on Tuesday, Feb. 12 in room W311E. 


Patrick Krause 


In order to grapple with spending increases, the U.S. healthcare industry is transforming the way physicians are compensated to provide care to patients from the current fee-for-service (FFS) model to one in which medical providers are paid a flat fee for servicing a defined group of patients.

This pivot in reimbursement is often discussed as far off. A survey by Numerof & Associates, a St. Louis, Missouri-based healthcare strategy consultancy, finds that most health organizations have been slow to make the shift: 54% receive less than 10% of their revenue from risk-based agreements.

However, as consumers become more sophisticated, and payers and health systems become more emboldened to wring out costs, we expect to see the shift in reimbursement models take off in the coming years. Provider groups that embrace this shift now have an opportunity to gain meaningful first-mover advantages.

Policy changes will further accelerate the transition. Annual FFS pricing increases are set by the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), and physicians treating Medicare patients will receive FFS raises of 0.5% this year and next. Then, from 2020 through 2024, there will be no automatic payment increases.

That’s not a recipe for growing revenues. But instead of fearing the transition to a value-based compensation model, physician groups that have the foresight and inclination to harness analytic tools, patient data, and standardized processes will have a leg up on other practices, positioning them to win new contracts with payers and systems, and capture more patient volumes.

A key to a profitable shift away from the traditional FFS reimbursement model is a practice’s ability to effectively price a procedure based on their understanding of the patient’s medical history and other risk factors, as well as an understanding of what it costs to perform the procedure, including consumables, implants, and the time the physician spends with the patient. Automated or programmatic data analysis can help.

A good place to begin is an analysis of the top procedures the group performs. Using data from the practice’s electronic medical records system to answer the following questions can help providers determine a competitive price for these procedures.  

As an example, an orthopedic group analyzing the pricing for a knee replacement might evaluate:

  • What is being spent on implants? Are there commonalities or preferences on implants across the group? If a standard implant system could be selected, the group could standardize procedures or timing, allowing it to negotiate for better prices with vendors and better price the time associated with the procedure.

  • What are the common complications associated with knee replacements?  Are there recommendations that can be given to patients to reduce complications and hospital readmissions and improve outcomes?

  • What are the facility charges?

With this level of data in hand, a group can develop a series of costs for procedures and an appropriate target profit per patient before negotiating a capitated rate with a health system or insurance company.
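As a hypothetical worked example of that arithmetic (every figure below is invented for illustration, not drawn from the article), a group might roll implant, facility, staff-time and expected complication costs into a per-case cost and then add a target margin before proposing a capitated or bundled rate.

```python
# Hypothetical worked example; all dollar figures and rates are made up.
implant_cost      = 4_500        # negotiated standard implant system, per case
facility_charge   = 9_000        # OR time, room and board
staff_time_cost   = 2_200        # physician and staff time valued at internal rates
expected_complication_cost = 0.06 * 12_000   # 6% readmission rate x average cost per readmission

cost_per_case = implant_cost + facility_charge + staff_time_cost + expected_complication_cost
target_margin = 0.12             # desired profit per patient

proposed_rate = cost_per_case * (1 + target_margin)
print(f"Cost per case: ${cost_per_case:,.0f}; proposed capitated rate: ${proposed_rate:,.0f}")
```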

Physician practices that are making the move to this model now by partnering with or serving a healthcare system or insurance provider are significantly growing their businesses. In doing so, they exchange the potential upside of being able to charge for extra services for the ability to gain market share by treating significantly more patients.

Groups that do not have this level of sophistication or understanding of their data or how to deliver care in a cost-effective manner will be at a profound disadvantage and will likely miss the opportunity to participate in ACOs or narrow provider networks.

All this requires significant analysis of data that the practice can draw from its electronic health records system. Some practices will be able to undertake this analysis with the leadership and guidance of a controller and a billing team, while others may need assistance from a consultant. Currently, numerous data-focused healthcare information service businesses are helping physician groups sift through their patient data, procedure outcomes, and expenses to effectively price their services.

This approach can deliver better healthcare to patients at a lower price while helping efficient practices grow significantly. By using data and analytics in this way, healthcare providers and insurers have the same incentive to lower healthcare costs. We’re already seeing this approach work.

Kaiser Permanente, for example, owns both its own insurance and health systems, in which 95% of its 12.2 million members are covered on a capitated basis. All manner of practices can benefit from this approach – dermatologists, orthopedic surgeons, gastroenterologists and ophthalmologists, to name just a few.

The Harvard Business Review writes that the value-based approach can trim waste from U.S. healthcare spending while also making physicians’ practices significantly more profitable: “Better products at lower costs generate higher value, which helps organizations achieve better market positions. Strategies based on that thinking have transformed other industries. We believe that they will do the same in healthcare. Population-based payment will play a critical role in helping care delivery groups make that leap.”

Whether physicians’ practices like it or not, this transition is taking place. It may take the next 15 to 20 years to get there, but it’s always good to be ahead of the curve.



Heather Landi


The All of Us Research Program, part of the National Institutes of Health (NIH), has launched the Fitbit Bring-Your-Own-Device (BYOD) project. Now, in addition to providing health information through surveys, electronic health records, and bio-samples, participants can choose to share data from their Fitbit accounts to help researchers make discoveries.

According to All of Us research program officials, the project is a key step for the program in integrating digital health technologies for data collection.

The All of Us Research Program, established by the White House in 2015, aims to advance precision medicine by studying the health data of 1 million diverse Americans over the next five years. One aim of the project is to include groups that have been historically underrepresented in research. As of September 2018, more than 110,000 people have registered with the program to begin the participant journey, and more than 60,000 have completed all elements of the core protocol.

The participants are sharing different types of information, including through surveys, access to their electronic health records and blood and urine samples. These data, stripped of obvious identifiers, will be accessible to researchers, whose findings may lead to more tailored treatments and prevention strategies in the future, according to program officials.

Digital health technologies, like mobile apps and wearable devices, can gather data outside of a hospital or clinic. This data includes information about physical activity, sleep, weight, heart rate, nutrition and water intake, which can give researchers a more complete picture of participants’ health. "The All of Us Research Program is now gathering this data in addition to surveys, electronic health record information, physical measurements, and blood and urine samples, working to make the All of Us resource one of the largest and most diverse data sets of its kind for health research," NIH officials said.

“Collecting real-world, real-time data through digital technologies will become a fundamental part of the program,” Eric Dishman, director of the All of Us Research Program, said in a statement. “This information, in combination with many other data types, will give us an unprecedented ability to better understand the impact of lifestyle and environment on health outcomes and, ultimately, develop better strategies for keeping people healthy in a very precise, individualized way.”

All of Us participants with any Fitbit device who wish to share Fitbit data with the program may log on to the All of Us participant portal at https://participant.joinallofus.org and visit the Sync Apps & Devices tab. Participants without Fitbit devices may also take part if they choose, by creating a free Fitbit account online and manually adding information to share with the program.

All of Us is developing additional plans to incorporate digital health technologies. A second project with Fitbit is expected to launch later in the year, NIH officials said, and this project will include providing devices to a limited number of All of Us participants who will be randomly invited to take part, to enable them to share wearable data with the program.

The All of Us research program plans to add connections to other devices and apps in the future to further expand data collection efforts and engage participants in new ways.

Fred Donovan

The University of Chicago Medicine was able to adjust hospital EHR use and educate doctors and nurses on how to reduce in-hospital sleep deprivation, thereby improving sleep for patients staying at the hospital.


The changes were part of a study designed by UChicago Medicine researchers called Sleep for Inpatients: Empowering Staff to Act (SIESTA), which examined the effects of nighttime sleep interruptions on patients in the hospital environment and how to improve patient sleep.

For the study, SIESTA employed electronic “nudges” using patients’ EHRs to urge doctors and nurses to avoid sleep disruptions that have minimal value, such as waking patients at night to take vital signs or administering nonurgent medications.

“Efforts to improve patients’ sleep are not new, but they do not often stick because they rely on staff to remember to implement the changes,” said Dr. Vineet Arora, a professor of medicine at the University of Chicago and the study’s lead author.

For the study, the researchers interviewed patients about sleep barriers. During the interviews, the researchers found that major barriers to sleep were taking vital signs, administering medications, and drawing blood during sleep hours.

The researchers also found that doctors did not know how to change the default order for vital signs every four hours or how to batch-order morning blood draws at a time other than 4 a.m.

Based on the interviews, the researchers developed and integrated electronic nudges into the EHR and taught doctors and nurses about the sleep friendly tools in the system. Taken together, these changes reduced the number of unnecessary sleep interruptions.
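The article does not spell out the logic behind those nudges, but conceptually they amount to simple rules evaluated at order entry. The sketch below is a rough illustration with hypothetical field names and stability criteria, not UChicago Medicine's actual EHR build.

```python
# Rough illustration of order-entry nudges; field names and criteria are hypothetical.
from dataclasses import dataclass

@dataclass
class Inpatient:
    clinically_stable: bool          # e.g., no abnormal vital signs in the last 24 hours
    vitals_order: str                # "q4h" default, or "sleep_friendly"
    morning_lab_draw_time: str       # default "04:00"

def siesta_nudges(p: Inpatient) -> list:
    """Return gentle prompts shown to the ordering clinician at order entry."""
    nudges = []
    if p.clinically_stable and p.vitals_order == "q4h":
        nudges.append("Patient appears stable: consider forgoing overnight vital signs.")
    if p.morning_lab_draw_time == "04:00":
        nudges.append("Consider batching morning labs at 06:00 instead of 04:00.")
    return nudges

print(siesta_nudges(Inpatient(clinically_stable=True, vitals_order="q4h",
                              morning_lab_draw_time="04:00")))
```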

The researchers published the results of the study in the January 2019 issue of the Journal of Hospital Medicine.

The one-year study focused on two 18-bed general medicine units at UChicago Medicine. Between March 2015 and March 2016, 1,083 patients were admitted either to the SIESTA-enhanced unit or to a standard hospital unit nearby.

Both units had doctors who were trained in the use of nighttime orders, but only the SIESTA unit had nurses who were trained to advocate for patients with the doctors.  

For the SIESTA unit, decisions by doctors and nurses not to take vital signs every four hours increased from 4 percent to 34 percent and sleep friendly timing of medication administration rose from 15 percent to 42 percent. Nighttime room entry decreased by 44 percent.

For the standard unit, decisions not to take vital signs every four hours increased from 3 percent to 22 percent, and sleep friendly timing of medication administration increased from 12 percent to 28 percent.

As a result, patients in the SIESTA unit had six fewer nighttime room entries, four times fewer sleep disruptions for medication administration, and three times fewer sleep disruptions for vital signs.

The researchers concluded that adjustments to the EHR system along with doctor and nursing education significantly reduced the number of nighttime vital sign orders and led to better timing of nighttime administration of medications in both units.

However, the study found that having the nurses as patient champions helped to sustain the benefits of a sleep friendly environment in the SIESTA unit over time.  

Sara Ringer, a patient in the SIESTA unit, said that the changes enabled her to sleep more soundly. “As a frequently hospitalized patient, I am used to being woken up as often as every 1 to 2 hours. It never feels like your body has a chance to rest and heal. My last hospitalization at University of Chicago was one of the easiest I've had because the hospital staff made it possible for me to sleep.”

“This illustrates the importance of engaging both nurses and physicians to create sleep-friendly environments in hospitals,” concluded Arora.

The research was funded by the National Institute on Aging and the National Heart, Lung and Blood Institute.



Ashley Lyles


Simple intervention cut monitoring time by 17% in randomized trial


Electronic health record (EHR) alerts when a telemetry order exceeds the recommended duration contributed to a safe decline in cardiac monitoring in a cluster-randomized clinical trial.

The EHR notification cut telemetry monitoring by 8.7 hours per hospitalization compared with no notification (P=0.001), and there wasn't a significant variation in emergency calls (6.0% vs 5.6%, P=0.90) or urgent medical events between groups, reported Nader Najafi, MD, of the University of California San Francisco, and colleagues in JAMA Internal Medicine.

The effect on telemetry duration was "notably smaller" than seen in other multicomponent quality improvement interventions, Najafi's group wrote.

However, it "was achieved without a concomitant educational or audit and feedback campaign, without human resources dedicated to monitoring telemetry use, and without an increase in adverse events as measured by rapid-response or medical emergency activation," they noted, so it would be "less costly and more scalable."

The study assessed 1,021 patients. The intervention group had a mean age of 64.5 and was 45% women, while the control group had a mean age of 63.8 and was 46% women.

The 12 general medicine service health teams – four hospitalist teams and eight house-staff teams – were cluster-randomized at the team level to receive or not receive pop-up alerts on their computer screens during daytime order entry when a patient had an active telemetry order outside the ICU that didn't meet the American Heart Association's indication-specific best practice standards (with a few local tweaks).

When physicians received a telemetry notification, they decided to stop telemetry monitoring 62% of the time, disregarded the notification 7% of the time, requested telemetry again 21% of the time, and responded to the alert but continued with the current course 11% of the time, the investigators found.
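The trial's alert logic is not reproduced in the article, but the duration check at its core can be sketched roughly as follows; the indication-to-duration table and function below are illustrative assumptions, not the AHA standards or UCSF's actual rules.

```python
# Illustrative duration check for a telemetry pop-up; the limits are made up.
from datetime import datetime, timedelta

RECOMMENDED_HOURS = {                # hypothetical indication-specific limits
    "chest_pain_ruled_out": 24,
    "post_syncope": 48,
    "electrolyte_abnormality": 48,
}

def telemetry_alert_due(indication, order_start, now, in_icu, daytime):
    """Fire a pop-up during daytime order entry if a non-ICU telemetry order
    has outlived its indication-specific recommended duration."""
    if in_icu or not daytime:
        return False
    limit = RECOMMENDED_HOURS.get(indication)
    return limit is not None and now - order_start > timedelta(hours=limit)

start = datetime(2019, 3, 1, 8, 0)
print(telemetry_alert_due("post_syncope", start, start + timedelta(hours=60),
                          in_icu=False, daytime=True))   # True: 60 h > 48 h limit
```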

The mean telemetry hours per hospitalization were 41.3 with the intervention versus 50.0 among controls, a reduction of 17%.

The investigators acknowledged the limitations of their work: The results might not generalize to other locations, as the study was based at a single medical facility. Also, the suggested telemetry durations were partially based on local expert opinion, making them more lenient than national practice guidelines.

"Finally, the preintervention mean telemetry hours at the UCSF Medical Center general medicine service was already lower than the baseline in prior studies,which may have limited the effect size of this intervention," the researchers wrote.


Samara Rosenfeld


Recurrent neural networks (RNNs) provided significantly better accuracy levels than the clinical reference tool in predicting severe complications during critical care after cardiothoracic surgery, a new study found.
 
Alexander Meyer, M.D., department of cardiothoracic and vascular surgery at German Heart Center Berlin, and his team used deep learning methods to predict several severe complications — mortality, renal failure with a need for renal replacement therapy and postoperative bleeding leading to operative revision — in post-cardiosurgical care in real time.

“For all tasks, the RNN approach provided significantly better accuracy levels than the respective clinical reference tool,” the researchers wrote.

Mortality was the most accurately predicted, scoring a 90 percent positive predictive value (PPV) and an 85 percent sensitivity score. Renal failure had an 87 percent PPV and 94 percent sensitivity score.

The deep machine learning method also showed area under the curve scores that surpassed clinical reference tools, especially soon after admission.
 
Of the complications studied, postoperative bleeding was the most difficult to predict, falling short of the accuracy achieved for mortality and renal failure: it had a PPV of 87 percent and a sensitivity of 74 percent.
 
The team studied electronic health record (EHR) data from 11,492 adults who had undergone major open-heart surgery from January 2000 through December 2016 in a German tertiary care center for cardiovascular diseases.
 
Patients’ data sets were studied for the 24 hours after the initial study, and if any complication occurred, patients were labeled accordingly.
 
Researchers measured the accuracy and timeliness of the deep learning model’s forecasts and compared predictive quality to established standard-of-care clinical reference tools.
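The paper's architecture is not detailed in the article. As a rough illustration of the general approach, the sketch below trains a small recurrent network on synthetic time-series data to output probabilities for three complications; the layer sizes, the use of Keras and the hourly-data shape are assumptions, not the Berlin team's model.

```python
# Illustrative sketch: a small recurrent model over synthetic ICU time series,
# not the architecture or data pipeline from the Berlin study.
import numpy as np
import tensorflow as tf

n_patients, n_timesteps, n_features = 1_000, 24, 30      # e.g., 24 hourly snapshots of vitals/labs
X = np.random.rand(n_patients, n_timesteps, n_features).astype("float32")
y = np.random.randint(0, 2, size=(n_patients, 3)).astype("float32")  # mortality, renal failure, bleeding

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_timesteps, n_features)),
    tf.keras.layers.GRU(64),                              # recurrent layer reads the time series
    tf.keras.layers.Dense(3, activation="sigmoid"),       # one risk probability per complication
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])
model.fit(X, y, epochs=2, batch_size=64, verbose=0)

# In deployment the model would be re-run as each new hour of data streams in,
# updating the three complication risks in real time.
print(model.predict(X[:1], verbose=0))
```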
 

Meyer told Healthcare Analytics News™ that one of the major findings of the study was that the system they developed outperformed all three pre-existing benchmarks. He added that it is possible to work on a real-time, uncurated clinical data stream.

With this information, physicians in emergency care units can intervene immediately if a patient is experiencing complications.
 
“Health systems should openly embrace this technology and ideally try to make use of it,” Meyer said.
 
At the very least, health systems can push for the regulatory changes and development work needed so that this technology can be used.
 
In a clinical setting, technology like this is difficult to implement and generally demands a financial incentive.
 
Hospitals can work with researchers and companies to push this technology forward and gain support from politicians to help provide financial means and ways to attain these tools.