
Samara Rosenfeld 


Ordering gamma glutamyl transferase tests through the search engine function of an electronic health record (EHR) reduced orders from 36,000 to about 1,000 per month, according to the findings of a study published in the American Journal of Managed Care.

The research team found that ordering tests through the search engine function, rather than from two other lists that appear on the main screen of the EHR, led to a dramatic reduction in orders. The team returned the gamma glutamyl transferase test option to one of its original places on the main screen, which caused the numbers to spike to 18,000. When the test option returned to its original place in all the lists, the numbers jumped to more than 35,000.

The solution could lead to a reduction in costs.

Researchers set out to evaluate if changes in how laboratory requests are presented in the EHR would lead to less testing.

Gari Blumberg, M.D., from the family medicine department at Tel Aviv University, and the research team compared the numbers of gamma glutamyl transferase tests ordered at different times. The researchers changed the parameters on the main laboratory screen of the EHR.

Researchers at the laboratory at Leumit Health Services in Israel removed the testing option from the main screen in 2014. With the option removed, physicians could only order the test if it was specifically searched for.

After two months, the main screen option partially returned, then it went back to its original status.

When the gamma glutamyl transferase tests could only be ordered through the search engine function, Blumberg and the research team saw a 97.3% reduction in the number of orders.

While the number of test orders has increased since July 2015, fewer are ordered now than before the intervention. As of 2018, physicians ordered about 25 to 34 tests per 1,000 health maintenance organization members. Prior to the intervention, physicians ordered 51 tests per 1,000 members.
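As a quick arithmetic check, the reported percentages follow directly from the article's rounded figures (a back-of-the-envelope sketch, not the study's own calculation):

```python
# Rough arithmetic check of the reported reductions, using the article's rounded figures.

before_orders = 36_000   # monthly GGT orders before the intervention
after_orders = 1_000     # approximate monthly orders when search-only

reduction_pct = (before_orders - after_orders) / before_orders * 100
print(f"Reduction in monthly orders: {reduction_pct:.1f}%")  # close to the reported 97.3%

# Orders per 1,000 HMO members, before vs. after (midpoint of the 2018 range)
rate_before = 51
rate_after_mid = (25 + 34) / 2
rate_drop_pct = (rate_before - rate_after_mid) / rate_before * 100
print(f"Drop in orders per 1,000 members: {rate_drop_pct:.1f}%")
```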

The study authors noted that there is a slight inconvenience when the test cannot be ordered on the main screen of the EHR. But using the search engine led to a dramatic decrease in the number of tests sent.

“Because the doctors are still able to choose the test should they feel it necessary by actively searching for it, it follows that the increased convenience was the most likely cause of the overordering, facilitated by the use of shortcuts,” the authors wrote.

The researchers wrote that while convenience is positive when it saves time, if it leads to overtesting, physicians do not gain much and are wasting money.


Kate Monica


Amidst concerns that EHR technology worsens provider burnout, one study showed 64 percent of clinicians say the systems have a positive effect on satisfaction levels.


New research suggests EHR technology may have a relatively positive reputation among healthcare professionals, with nearly 70 percent of surveyed providers reporting that EHR systems improve care quality.

This insight comes from a recent Future Health Index 2019 report commissioned by Philips.

Researchers surveyed 3,100 healthcare professionals and 15,000 healthcare consumers across 15 countries to gauge opinions of EHR technology in the current digitized care system.

While researchers found many healthcare professionals see the benefits of EHR technology and other health IT tools in clinical care, providers in most countries are not using health IT to its fullest potential.

For example, 80 percent of providers have engaged in health data exchange with other providers within their own care facility. However, only 32 percent of surveyed clinicians have shared patient health data with providers outside their facility.

Fifty-six percent of providers who do not share patient health data with outside hospitals and health systems lack the health IT infrastructure to do so. The lack of EHR interoperability between different provider systems restricts the scope of health data exchange for half of clinicians.

Fifty percent of providers also cited concerns over data privacy and security as an impediment to health data exchange with care facilities outside their health system.

In addition to this general lag in advanced health IT use among care professionals, many clinicians also struggle with EHR implementation and administrative burden.

“Many countries experience challenges with the implementation of digital health records and there is a common assumption that healthcare professionals feel these records can simply add administrative tasks to their workload,” wrote researchers in the report.

Health data exchange between patients and providers is similarly low. Only 36 percent of surveyed patients who use patient portals or other health IT regularly share their health information with their provider. Meanwhile, 26 percent of patients share health data with providers when they have a specific concern.

Despite these drawbacks, most surveyed healthcare providers agree EHR technology has had an overall positive impact on care quality.

Furthermore, 64 percent of surveyed providers said EHR technology has had a positive impact on provider satisfaction. Fifty-nine percent reported that EHR use has helped to boost patient health outcomes.

“Additionally, 57 percent of healthcare professionals report that, in the past five years, their experience has been positively impacted by having access to patients’ full medical history,” wrote researchers.

Patients who use patient portals and other technologies to access and share their data also generally report higher levels of satisfaction.

“Those with access to their digital health record report better personal experiences in healthcare and better quality of care available to them than those who do not have access,” stated researchers.

Specifically, 82 percent of patients with access to their EHRs rate their experience with their providers as good, very good, or excellent. Comparatively, 66 percent of patients without access to their EHRs reported having a positive experience with their provider.

“The data suggests that there could be greater potential for individuals’ uptake of digital health technology and mobile health apps if usage of these technologies was more frequently recommended by healthcare professionals,” authors wrote in the report.

“There is also evidence to suggest that individuals will be more likely to use digital health technology if it’s easier to share data with their healthcare professional,” they added.

Overall, patients who access and exchange their own digital health information are more likely to have a positive perception of care quality.

“The challenge, now, is to encourage more individuals to share data with their healthcare professional, giving healthcare professionals access to more up-to-date and complete information that will allow for more coordinated patient care,” suggested researchers.


Mike Miliard


Even five years after go-live, many health systems aren't realizing the full value of their electronic health records, says a new Chartis Group report. Gaining clinical and financial ROI depends on a "sustained, organized approach."


Why aren't more hospitals realizing the benefits of their electronic health records? And what are the organizations that are capitalizing on their EHRs doing well that others should try?

Those are questions asked and answered in a new report from the Chartis Group, which surveyed some leading health systems that are leveraging their IT systems to enable big improvements in length of stay, reductions of adverse drug events, boosts in nursing efficiency, fewer unnecessary lab tests, speedier cash collections, better preventative care and more.

WHY IT MATTERS
The report, by Douglas Thompson and Tonya Edwards, MD, shows that even five years after attaining Stage 4 on the HIMSS EMR Adoption Model, where the benefits of improved clinical decision support should begin to show up system-wide, most hospitals still haven't fully realized them.

Moreover, "increased costs of operating more sophisticated EHRs leave some further behind financially, leading critics to claim that EHRs have been a huge waste of time and money."

But it doesn't have to be that way. Indeed, the report shows how many leading health systems have realized big ROI on their EHR investments with improved patient outcomes and cost efficiencies.

“Texas Health Resources saved an estimated $10 million from a greater than 60 percent reduction in adverse drug events at three hospitals one year after EHR go-live,” for instance. “Sentara realized $57 million in annual EHR-driven savings, net of expenses, and a 50 percent reduction in hospital mortality ratio. And Memorial Hermann saved over $2 million annually from increased use of just six standardized electronic order sets.”

What are they doing right that other health systems aren't?

Illustrated by a series of anecdotes from those organizations and others, Thompson and Edwards show that too few hospitals understand that extracting lasting value from IT implementations requires a "sustained, organized approach," bolstered by a "firm commitment from business leaders."

And beyond mere technology, effective deployments depend on those health systems understanding the enterprise-wide cultural shift and specific process changes that will be necessary from clinicians and staff.

It's key, they said, to stay focused on "benefits realization amid the distractions of the design, build and implementation process," not just until go-live day, but afterward, "when the next big change initiative comes along."

Based on its experience with and review of several hundred hospitals nationwide, Chartis found that the most successful ones share some EHR best practices in common.

The health systems gaining the most from their technology investments are those who have bought their EHRs with an eye toward using them for specific strategic outcomes – and know how important it is to steer system implementation and optimization toward those goals.

Those hospitals also know that the most beneficial aspects of an EHR don't happen just by flipping a switch and running the system, but when the new tool is used to help change how day-to-day work is done. It's the difference between "automation" and "innovation," said Thompson and Edwards.

They also said that "without a formal structure, benefits realization is left to chance – and benefits don’t happen by accident," pointing to the value of "dedicated resources, well-defined roles and robust governance."

It's also critical to measure the system's benefit after go-live through quality indicators, financial data and other key performance indicators, to ensure its value is manifesting itself, they said. "If you don’t measure it, you won’t achieve it."

THE LARGER TREND
The value of a properly deployed electronic health record system is hard to argue with. And even if some hospitals are struggling to show ROI after Stage 4 of the HIMSS EMRAM, the advantages that can be gained by pushing higher up that ladder can be substantial.

We've shown, for instance, how Los Angeles-based Martin Luther King Jr. Community Hospital joined just 6.4 percent of other American hospitals at Stage 7 by treating that goal as a formal project, with a prep team and a designated project manager to lead the charge. By optimizing its EHR, the hospital was able to make big gains on an array of clinical use cases.

The Chartis Group report shows the value of approaching EHR rollouts strategically, and part of that is a robust and clinician-focused training program. As we showed this week, physician dissatisfaction and poor user experience have less to do with software design and much more to do with the quality of system training. The better the training, the better the care delivered and the outcomes reported.

ON THE RECORD
"Health system leaders should be satisfied with nothing less than achievement of the strategic clinical and business objectives of their technology investments," said Thompson and Edwards in the Chartis Group report.

"While the majority of health system EHRs have not delivered on that promise, there is ample evidence that with clear goals, careful planning, good governance, and ongoing measurement and commitment, any organization can expect real, substantial benefits from EHR use," they added. "Organizations that have already implemented or upgraded their EHRs can use the principles and methods described above to optimize their EHRs to deliver measurable benefits."


Nathan Eddy


Humanwide pilot collects data with mobile monitoring devices, then pulls it into an EHR so the care team can help patients manage conditions.


A clinical trial program initiated by Stanford Medicine has deployed a data-driven, integrated team approach to predict and prevent disease and better detect overlooked health conditions and risks.

WHY IT MATTERS

The Humanwide pilot project uses science and technology to understand each patient, from lifestyle to DNA, and apply that knowledge to transform their health.

The organization’s model combines tools of biomedicine with a data-driven, team-based approach to focus on predicting and preventing disease before it strikes.

As part of the pilot, Humanwide patients used mobile monitoring devices, including a glucometer, pedometer, scale and blood pressure cuff, to regularly measure key health metrics.

The data automatically uploaded to their electronic health records for remote monitoring by their health care team, and the care team then helps the patient manage current health conditions and address future risks through a plan aligned with his or her personal goals.
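The remote-monitoring loop described above can be sketched as a simple threshold check on uploaded readings. Everything here — the `Reading` structure, the metric names, and the alert thresholds — is an illustrative assumption, not Humanwide's actual implementation:

```python
# Illustrative sketch: flag out-of-range home-monitoring readings for care-team review.
# Metric names, thresholds and the Reading structure are assumptions, not Humanwide's design.
from dataclasses import dataclass

@dataclass
class Reading:
    patient_id: str
    metric: str      # e.g. "glucose_mg_dl", "systolic_mm_hg"
    value: float

# Hypothetical alert thresholds a care team might configure per metric
THRESHOLDS = {
    "glucose_mg_dl": (70, 180),
    "systolic_mm_hg": (90, 140),
}

def flag_readings(readings):
    """Return readings that fall outside their metric's configured range."""
    flagged = []
    for r in readings:
        low, high = THRESHOLDS.get(r.metric, (float("-inf"), float("inf")))
        if not low <= r.value <= high:
            flagged.append(r)
    return flagged

uploads = [
    Reading("p1", "glucose_mg_dl", 210.0),   # above range -> flagged
    Reading("p1", "systolic_mm_hg", 118.0),  # within range
]
print(flag_readings(uploads))
```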

ON THE RECORD

“With Humanwide, we’re able to focus on the whole human: who they are when they’re working, who they are when they’re playing, who they are when they’re at home,” Mahoney said in a statement. “This program demonstrates how we can zero in on what matters to a patient, to craft the entire care plan around their goals.”

EARLY LESSONS

In a paper in the Annals of Family Medicine, she and co-author Steven Asch outlined the early lessons of the year-long project, which involved 50 patients.

The paper noted encouraging the use of wearable devices in a healthy population helped identify multiple patients with early diabetes or hypertension, prompting early intervention and self-management.

The pilot participants underwent genetic assessments that gauged their risk for cancer and other diseases, and a pharmacogenomic evaluation that determined which types of drugs are most effective for their individual biology and cause the fewest side effects.

The patients also tracked key health metrics, such as blood glucose levels and blood pressure, using portable digital devices that beamed their readings back to their electronic health records for remote monitoring by care teams.

The teams, which included a primary care physician, nutritionist, behavioral health specialist and clinical pharmacist, used this data to inform each patient's care.

They also considered other factors, such as social and environmental determinants, and succeeded in identifying several previously overlooked health conditions and risks for different participants, from hypertension to heightened risk for breast cancer.

“Looking at genomic data and other factors that actually predict patient health allows us to be proactive instead of waiting for something to happen and having to react to that,” David Entwistle, president and CEO of Stanford Health Care, said in a statement. “Humanwide is an opportunity to build a deep understanding of each patient in a unique way.”


Joseph Goedert


Computer algorithms were used to analyze 29 clinical variables in UPMC’s electronic health record systems, and were able to recognize patients with sepsis within six hours of arrival.

But it took a lot of learning to reach this stage and be able to spot the signs of sepsis and the hidden subtypes of sepsis, say researchers at Pitt Health Sciences, part of University of Pittsburgh Medical Center.

“For over a decade there have been no major breakthroughs in the treatment of sepsis; the largest improvements we’ve seen involve the enforcing of ‘one size fits all’ protocols,” says study lead author Christopher Seymour, MD, an associate professor in Pitt’s department of critical care medicine. “But these protocols ignore that sepsis patients are not all the same.”


In fact, use of the algorithms revealed four distinct sepsis types:

  • Alpha: the most common type (33 percent), comprising patients with the fewest abnormal laboratory test results, least organ dysfunction and lowest in-hospital death rate at 2 percent.
  • Beta: older patients, comprising 27 percent, with the most chronic illnesses and kidney dysfunction.
  • Gamma: similar frequency to beta but with elevated levels of inflammation and primarily pulmonary dysfunction.
  • Delta: the least common at 13 percent, but the most deadly type, often with liver dysfunction and shock, and the highest in-hospital death rate at 32 percent.

Having analyzed the clinical variables of 20,000 patients, researchers then studied the electronic health records of 43,000 other UPMC sepsis patients, and the four subtypes held. The findings held again when the team studied rich clinical data and immune response biomarkers from about 500 pneumonia patients enrolled at 28 hospitals across the nation.
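A generic way to derive subtypes like these is to cluster patients on standardized clinical variables. The Pitt team used its own statistical modeling; the sketch below is only a plain k-means on synthetic data to illustrate the idea:

```python
# Generic sketch of deriving patient subtypes by clustering standardized clinical
# variables, in the spirit of the Pitt analysis. The study used its own statistical
# methods; this plain k-means on synthetic data is purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for 29 standardized clinical variables across 1,000 patients
X = rng.normal(size=(1000, 29))

def kmeans(X, k=4, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each patient to the nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center; keep the old one if a cluster empties
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, _ = kmeans(X, k=4)
print(np.bincount(labels, minlength=4))  # cluster sizes, analogous to subtype frequencies
```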

The next step was to apply their findings to recently completed international clinical trials that tested promising therapies, but results were unremarkable.

Sepsis recognition can be tricky, says Derek Angus, MD, senior author of the study and an associate professor in Pitt’s department of critical care medicine. Most doctors are not confused by a classic case of sepsis, but such cases are only a small portion of the total; in most others, sepsis is recognized only once it has become obvious, when it is too late to make the first correct treatment moves, Angus notes.

Early goal-directed therapy (EGDT), an aggressive resuscitation protocol that includes placing a catheter to monitor blood pressure and oxygen levels and delivering drugs, fluids and blood transfusions, was found to have no benefit in a five-year, $8.4 million study. But when Seymour’s team reexamined the results, they found that EGDT was beneficial for patients with the Alpha type of sepsis, but resulted in worse outcomes for those with the Delta subtype.

“Intuitively, this makes sense as you would not give all breast cancer patients the same treatment,” Angus explains. “Some breast cancers are more invasive and must be treated aggressively. Some are positive or negative for different biomarkers and respond to different medications. The next step is to do the same for sepsis that we have for cancer—to find therapies that apply to the specific types of sepsis and then new clinical trials to test them.”

That’s why it is imperative that patients have their vitals and labs captured upon arrival at the hospital, Seymour says. Sepsis requires the presence of organ dysfunction, and six organ systems can be affected by the disease. Consequently, early treatment intervention should begin within six hours of suspected sepsis, the same window in which data is captured at hospital presentation.

Capturing the vitals and labs early, with additional information available in the electronic health record, quickly helps physicians at the bedside to wrap their minds around the patient’s physiology. But now, physicians have another powerful tool at their disposal—machine learning technology.

Machine learning can find patterns that doctors cannot—much more than the three to four variables that doctors usually use. Data in the EHR can help doctors select variables to consider and then run machine learning models in collaboration with biostatisticians and computer scientists, says Seymour.

“We rely on doctors to find sepsis and quickly get patients on antibiotics, and we have machine learning and the EHR to parse out the type of sepsis,” he adds.


Samyukta Mullangi, John P. Pollak, Said Ibrahim


Health systems do not systematically collect information on social determinants of health (SDH) — the conditions in which people are born, live, grow, and age — despite knowing that they have a big impact on individual and population health. But the shift from reimbursing providers for the volume of services they deliver (fee for service) to the quality of patient outcomes relative to cost (value) is causing them to focus more on maintaining patients’ health and not just curing disease. This shift is causing providers to start investing in population health management strategies, which require them to better understand the local population and identify unmet needs.

The challenge is that the SDH information that physicians collect from patients and enter into their electronic medical records (EMRs) is pretty limited. Even though 83% of family physicians agree with the Institute of Medicine’s 2014 recommendation that they collect sociodemographic, psychological, and behavioral information from patients and put it into their EMRs, only 20% say they have the time to do so. But alternative means of collecting such information are emerging: smartphones, credit card transactions, and social media.

Smartphones. The Pew Research Center estimates that more than three-fourths of Americans now own smartphones. One example of how these devices could be used to collect SDH information involves the mobile applications that health systems offer to allow patients to easily book appointments or contact medical providers. These apps can also access information on patients’ location, which can be cross-referenced with rich databases like Foursquare’s book of local businesses or city-level heat maps on crime/domestic violence to understand a patient’s experience of his or her neighborhood — e.g., the availability of fresh food via local grocers or bodegas and the ability to exercise outside in relative safety. In a research setting, this type of location sharing has yielded startling insights.

In one interesting study on smoking cessation and relapses, patients’ location data, along with their self-reporting on their craving levels and smoking status, was overlaid on a point-of-sale tobacco outlet geodatabase to demonstrate that an individuals’ daily exposure to these retail outlets was significantly associated with lapses even when cravings were low. This real-time quantification about an individual’s interactions with her local environment unearthed novel influences on health behaviors that were likely invisible to the patient herself. This type of geolocation data is currently still being developed and tested in the research setting, but one day it may be used to make patients more aware of these triggers and resist unhealthy temptations.
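The kind of exposure analysis described in that study can be approximated by checking each location ping against an outlet database using a great-circle distance. The coordinates and radius below are made up for illustration; a real analysis would use a point-of-sale outlet geodatabase:

```python
# Illustrative sketch: count how often a patient's location pings fall within a
# given radius of known tobacco retail outlets. Coordinates here are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def exposure_count(pings, outlets, radius_m=100):
    """Number of pings that fall within radius_m of any outlet."""
    return sum(
        any(haversine_m(plat, plon, olat, olon) <= radius_m for olat, olon in outlets)
        for plat, plon in pings
    )

outlets = [(40.7128, -74.0060)]                      # hypothetical outlet location
pings = [(40.7129, -74.0061), (40.7300, -74.0500)]   # one nearby ping, one distant
print(exposure_count(pings, outlets))                # -> 1
```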

Credit-card transactions. These are another goldmine of information that can help round out the medical record. For instance, a Gates Foundation- and United Nations Foundation-funded investigation into the economic, social, and health status of women in developing countries combined credit card records with records on their phone calls to identify patterns in people’s socioeconomic behaviors. The analysis resulted in six distinct lifestyle clusters in terms of expenditure patterns, age, mobility, and social networks. One can imagine that this type of aggregation can be useful as health systems increasingly work to tailor community and outreach programs to patients.

Credit-card statements do not contain the details necessary to generate insights (e.g., what actual items make up a bill from the grocery store). That level of granular detail would go a long way toward understanding whether patients fill their prescriptions, purchase cigarettes, or order salads. Some digital grocers (e.g., Instacart, Peapod), drug retailers (e.g., CVS, Walgreens), and payment kiosks (e.g., Square) are now emailing itemized receipts to consumers (with their consent). One group at Cornell Tech has created software tools that scrape these receipts and analyze purchases against a patient’s personal nutritional goals, a research effort with commercial potential. Such approaches not only collect information on SDH but also raise the patients’ level of awareness of the relationship between healthy behaviors and health itself.
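A receipt-scraping pipeline of that kind might, in simplified form, parse itemized lines and tally them against nutrition categories. The receipt format, keywords and categories below are all hypothetical, not the Cornell Tech group's actual tooling:

```python
# Toy sketch of analyzing an emailed itemized receipt against simple nutrition
# categories. The receipt format, item names and category keywords are invented.
import re

# Hypothetical keyword -> category lookup
CATEGORIES = {
    "salad": "healthy",
    "spinach": "healthy",
    "cigarettes": "flagged",
    "soda": "sugary",
}

LINE_RE = re.compile(r"^(?P<item>.+?)\s+\$(?P<price>\d+\.\d{2})$")

def categorize_receipt(text):
    """Map each parsed receipt line to a category based on keyword matches."""
    tallies = {}
    for line in text.strip().splitlines():
        m = LINE_RE.match(line.strip())
        if not m:
            continue
        item = m.group("item").lower()
        for keyword, cat in CATEGORIES.items():
            if keyword in item:
                tallies[cat] = tallies.get(cat, 0) + 1
    return tallies

receipt = """
Garden Salad  $4.99
Cola Soda 2L  $1.89
Baby Spinach  $3.49
"""
print(categorize_receipt(receipt))  # -> {'healthy': 2, 'sugary': 1}
```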

Social media. Leveraging the willingness of people to divulge personal details on social media is yet another emerging frontier in the effort to collect SDH data. It is being used to successfully access populations that have historically been considered hard to reach: younger people, females, and low-income individuals. New features on popular sites like Facebook that allow individuals to mark themselves safe during natural disasters represent an initial foray into using this medium for gathering more SDH data. Health systems that engage patients via social media can elicit answers to questions around food insecurity, employment status, physical activity, and so on. In fact, new research suggests that many adult Facebook and Twitter users are willing to share their social media and medical data and link it with EMR data for research purposes.

Certainly, several pragmatic issues might create barriers to applying these approaches. An obvious one is privacy. More research will need to be done to ascertain patients’ comfort with novel ideas such as giving physicians access to their purchase histories or locations. It is also critical that the information gathered through these novel mechanisms not be used in a punitive manner but rather to inform clinician counseling and to support patients in their efforts to pursue healthy behaviors. Patients are not likely to share credit card or social media data, for example, if they perceive there to be a link between the information gathered and punitive responses such as the denial of insurance coverage or increased co-pays.

Another obstacle lies in the very act of obtaining consent from a large number of patients to participate in such information-gathering programs. One notable effort at Parkland Hospital in Dallas, which linked data about patients’ usage of food pantries, homeless shelters, and other social services with their medical records, found that patients were more willing to be enrolled into a digital database when asked to do so by community partners that had earned their trust rather than in the emergency room. Discouragingly, privacy concerns over the Trump administration’s policies tying social services usage to legal status have caused many undocumented immigrants to ask to be erased from social services’ IT systems.

Finally, it may be difficult to obtain buy-in from physicians who are already suffering from information overload. To overcome it, data will need to be turned into intelligent summaries with clear visuals and actionable takeaways. Additionally, clinics need to invest in support staff and ancillary services that help at-risk patients. For example, clinics can be outfitted with connections to community-based resources (housing programs, job training centers, and nutritional supplement programs). These investments will go a long way to ensuring that physicians are receptive to the work of monitoring additional data about SDH.

With these elements in place, health care systems will be able to harness digital technologies to identify the needs and interventions required to create healthier communities.

The authors wish to acknowledge Jessica Ancker for her critical review of this manuscript.



Heather Landi


An artificial intelligence tool can help diagnose post-traumatic stress disorder in veterans by analyzing their voices, a new study found.

Medical researchers and engineers designed an AI tool that can distinguish, with 89% accuracy, between the voices of those with or without PTSD, according to their study published Monday in Depression and Anxiety. The findings open up the possibility of using the AI-based voice analysis tool to diagnose PTSD more rapidly or through telemedicine.

“Our findings suggest that speech-based characteristics can be used to diagnose this disease, and with further refinement and validation, may be employed in the clinic in the near future,” senior study author Charles Marmar, M.D., from the department of psychiatry at NYU School of Medicine, said in a statement. A division of the U.S. Army supported the study.

The U.S. Department of Veterans Affairs reports that between 11% and 20% of veterans who served in operations in Iraq and Afghanistan have PTSD, while about 12% of Gulf War veterans have PTSD. Additionally, it is estimated that 30% of Vietnam veterans have had PTSD in their lifetimes.

The ability to improve PTSD diagnosis has wider implications, as more than 70% of adults worldwide experience a traumatic event at some point in their lives, with up to 12% of people in some struggling countries suffering from PTSD, according to the Sidran Institute.

According to researchers, the ability to accurately screen for and diagnose PTSD remains challenging. The diagnosis is usually based on clinical interviews or self-report measures. The gold standard for diagnosing the condition is the clinician-administered PTSD scale, a structured clinical interview to assess the frequency and severity of PTSD symptoms and related functional impairments. However, even that assessment is subject to biases. The interviews also require a lengthy visit to a clinician’s office, which some patients may be unwilling or unable to do.

An objective test is lacking, according to the researchers, who developed a classifier of PTSD based on objective speech-marker features that discriminate PTSD cases from controls. The research team included psychiatrists from New York University School of Medicine, Steven and Alexandra Cohen Veterans Center for the Study of Post-Traumatic Stress and Traumatic Brain Injury and engineers from SRI International, the institute that also invented Apple’s Siri feature.

For the study, researchers used speech samples from war zone-exposed veterans, 53 cases with PTSD and 78 controls, assessed with the clinician-administered PTSD Scale. Audio recordings of clinical interviews were used to obtain 40,526 speech features, which the team’s AI program sifted through for patterns.

The program linked patterns of specific voice features with PTSD, including less clear speech and a lifeless, metallic tone, both of which had long been reported anecdotally as helpful in diagnosis. 

The theory is that traumatic events change brain circuits that process emotion and muscle tone, which affects a person’s voice, according to researchers.

“We believe that our panel of voice markers represents a rich, multidimensional set of features which with further validation holds promise for developing an objective, low cost, noninvasive, and, given the ubiquity of smartphones, widely accessible tool for assessing PTSD in veteran, military, and civilian contexts,” the researchers said.

Other healthcare researchers are also exploring the use of voice analysis to detect and diagnose disease. A team at Mayo Clinic is exploring how to use AI-supported voice analysis as a noninvasive diagnostic tool to identify changes in tone or cadence that could potentially be predictive of an outcome, such as high blood pressure, stroke or heart attack.

The research team behind this latest study plans to train the AI voice tool with more data, further validate it on an independent sample and apply for government approval to use the tool clinically.


Samara Rosenfeld 


In the U.S., more than 7 million patients have undiagnosed Type 2 diabetes mellitus. But a recent study found that by using machine learning on data that already exist in a patient’s electronic health record (EHR), large populations of patients at high risk of the condition can be predicted with 88% sensitivity.

What’s more, the machine learning model had a positive predictive value of 68.6%.

Chaitanya Mamillapalli, M.D., endocrinologist at Springfield Clinic in Illinois, and Shaun Tonstad, principal and software architect at Clarion Group in Illinois, along with their research team, aimed to evaluate a machine learning model to screen EHRs and identify potential patients with undiagnosed Type 2 diabetes mellitus.

Mamillapalli told Inside Digital Health™ that the team extracted data from an EHR at the Springfield Clinic. The extracted data comprised non-glucose parameters, including age, gender, race, body mass index, blood pressure, creatinine, triglycerides, family history of diabetes and tobacco use.

The team had an initial sample size of 618,022 subjects, but only 85,719 subjects had complete records.

After extracting the data, the subjects were equally split into training and validation datasets.

In the training group, the machine learning model was trained with the decision jungle binary classifier algorithm, using those parameters to learn whether a subject is at risk of diabetes.

In the validation set, the model then classified each subject’s risk of the disease from the extracted non-glycemic parameters.

The validation subject probabilities were then compared to how the team defined Type 2 diabetes mellitus — random glucose greater than 140 mg/dL and/or HbA1c greater than 6.5%.
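The study's case definition above can be sketched as a simple labeling function. This is an illustration only; the function and parameter names are assumptions, not the study's actual schema:

```python
def has_type2_diabetes(random_glucose_mg_dl, hba1c_pct):
    """Label a subject per the study's case definition:
    random glucose greater than 140 mg/dL and/or HbA1c greater than 6.5%."""
    return random_glucose_mg_dl > 140 or hba1c_pct > 6.5

print(has_type2_diabetes(155, 6.1))  # True: glucose above threshold
print(has_type2_diabetes(120, 5.4))  # False: both values below threshold
```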

Predictive accuracy was also measured with the area under the receiver operating characteristic curve (AUC) and the F1 score.

In the dataset, the model identified more than 23,000 true positives and 3,250 false negatives.

If the machine learning model is deployed in the back end of the EHR, physicians will be prompted when a patient’s health data show that the patient is at high risk of diabetes and should be screened, Mamillapalli said.

Mamillapalli said that patients generally go undiagnosed for four to six years before receiving a formal diagnosis of Type 2 diabetes mellitus. He told us that because of this, patients are exposed to complications, which could cost up to $33 billion per year in the U.S.

But identifying the condition as early as possible could decrease the risk of complications.

However, the screening rate for diabetes is still only about 50%.

In a written statement to Inside Digital Health™, Mamillapalli wrote that “using an automated, scalable electronic model, we can deploy this tool to screen large chunks of the population.”

Mamillapalli said that the second phase of the team’s research is to change the algorithm slightly to diagnose prediabetes, which affects 90 million people, but is only diagnosed in 10% of that population.

“As the predictive accuracy is improved, this machine learning model may become a valuable tool to screen large populations of patients for undiagnosed (Type 2 diabetes mellitus),” the authors wrote.

The findings of the study, titled “Development and validation of a machine learning model to predict diabetes mellitus diagnoses in a multi-specialty clinical setting,” were presented at the American Association of Clinical Endocrinologists in California.


Bill Siwicki


The VEVA tool has helped improve caregiver productivity and efficiency – and providers appreciate that it reduces the number of clicks per query.


Vanderbilt University Medical Center in Nashville, Tennessee, is one of the largest academic medical centers in the Southeast, serving the primary and specialty healthcare needs for patients throughout Tennessee and the mid-south.

THE PROBLEM

Like many other healthcare organizations, Vanderbilt's caregivers have felt the administrative burden of clinical documentation and labor-intensive healthcare technologies. Caregivers found that the day-to-day practice of medicine was challenged by IT workflows that got in the way of patient care rather than improving it.

Querying and entering patient information via keyboard and mouse, for example, proved to be an inefficient use of the caregivers' expertise and was taking them away from engaging with their patients at the bedside.

PROPOSAL

When Apple debuted Siri in 2011, and when the Amazon Echo became ubiquitous in 2016, it became clear that advances in artificial intelligence and natural language processing had matured to the point where communicating naturally with technology was no longer science fiction, said Dr. Yaa Kumah-Crystal, core design advisor at Vanderbilt University and assistant professor of biomedical informatics and pediatric endocrinology at Vanderbilt University Medical Center.

"We knew we could leverage these technologies to entirely bypass the keyboard and mouse and instead empower our providers to use their voice to interact with the HER," she said.

Vanderbilt partnered with Nuance Communications at that point to develop a voice user interface prototype for the electronic health record. They called it the Vanderbilt EHR Voice Assistant, or VEVA.

This virtual tool enables caregivers to interact with the EHR using natural speech. In this way, caregivers can easily retrieve information from the EHR to better understand the patient story when delivering care, she said. Today, VEVA is linked with the EHR and has been tested with more than 20 caregiver users.

MARKETPLACE

Vanderbilt's VEVA EHR voice assistant is a homegrown technology, built with the help of a vendor, Nuance. As a result, it is difficult to point to direct examples of other such products in the marketplace. But there are other vendors on the market with different types of voice assistant technology, such as IBM, Infor, Oracle, Salesforce and SAP.

MEETING THE CHALLENGE

In partnership with Nuance, Vanderbilt's team of medical informaticists, clinicians and software engineers used AI and natural language processing to build a voice interface prototype.

"Our providers can launch the VEVA application through the EHR in the context of a patient encounter," Kumah-Crystal explained. "FHIR resources are used to retrieve relevant content about the patient and render it back to the provider. For example, the provider can ask VEVA a question or say, 'Tell me about this patient.'

"VEVA applies its natural language understanding engine to translate that voice command into text and presents relevant results to the provider, such as a summary of the most recent patient visit," she said.

"VEVA does not only answer the question posed but, using natural language understanding, infers the intent behind the question and provides additional context."  Dr. Yaa Kumah-Crystal, Vanderbilt University Medical Center

Providers also can ask specific questions about recent diagnoses, lab test results and medications.

"VEVA does not only answer the question posed but, using natural language understanding, infers the intent behind the question and provides additional context," she noted. "If the provider asks about the patient's weight, for example, the system not only provides the current weight, but will also mention the previous weight, degree of change and other trend information. This information is presented as voice replies, textual information and on-screen graphics."

In other words, the VEVA assistant is designed to support busy caregivers and their workflows, diminishing the administrative and information retrieval burden of navigating unintuitive graphical user interfaces, she added.

RESULTS

By testing the voice assistant's functionality in the caregiver workflow, Vanderbilt caregivers have realized enhancements in the delivery of care. They are armed with efficient, simple ways to retrieve valuable patient information, which helps them better understand the patient's story in order to manage care, Kumah-Crystal said.

"VEVA has demonstrated the ability to impact on caregiver productivity and efficiency," she explained. "In particular, providers can save time by reducing the number of clicks per query, which translates to a 15 percent improvement in task-time savings."

Finally, VEVA rates well in system usability testing and has improved the providers' workflow experiences by enabling them to interact with the EHR using simple, intuitive, natural language queries, she added.

ADVICE FOR OTHERS

"We recommend that organizations start with a baseline assessment of both the amount of time providers spend on administrative tasks, as well as the impact this creates on workflows and caregiver productivity," Kumah-Crystal advised. "Without this baseline, it can be difficult to measure progress and success of the virtual assistant or any voice-powered technology."

Next, do the required homework and create a business case for workflow optimization, she said. Explore advances in machine learning, AI and natural language processing, and learn more about prototypes that are available and working in other organizations, she said. Find out about vendors that can leverage these technologies to make voice interaction with EHRs a reality, she added.

"A cross-functional team is essential as well," she suggested. "Include subject matter experts across a range of disciplines, including caregivers, clinical informaticists, software engineers and information theorists. Build an overarching model and make sure you take into consideration the providers' workflow, information needs and so on."

Additionally, build a prototype while users test and provide their feedback, she said.

"You want to understand information theory and map queries and content to satisfy provider's needs," she concluded. "Find out how you might 'break' the technology to uncover the commands it can't handle, for example, so you can overcome those before you make the solution widely available."


Samara Rosenfeld 


A new machine learning algorithm was highly accurate in determining whether a patient is likely to have a cholesterol-raising genetic disease that can cause early heart problems, according to the results of a study conducted by researchers at the Stanford University School of Medicine.

The algorithm was 88 percent accurate in identifying familial hypercholesterolemia (FH) in one data sample and 85 percent accurate in another.

In the study published in npj Digital Medicine, Joshua Knowles, M.D., Ph.D., assistant professor of cardiovascular medicine at Stanford, and his research team created an algorithm using data from Stanford’s FH clinic to learn what distinguishes an FH patient in an electronic health record (EHR).

The algorithm was trained to pick up on a combination of family history, current prescriptions, lipid levels, lab tests and more to understand what signals the disease.

The foundation of the algorithm was built using data from 197 patients who had FH and 6,590 patients who did not, so the program could learn the difference between positive and negative results.

When the algorithm was trained, the research team initially ran it on a set of roughly 70,000 new de-identified patient records. The team reviewed 100 patient charts from the patients flagged and found that the algorithm had detected patients who had FH with 88 percent accuracy.

Knowles and his partner, Nigam Shah, MBBS, Ph.D., associate professor of medicine and biomedical data science at Stanford, collaborated with Geisinger Healthcare System to further test the algorithm.

The algorithm was tested on 466 patients with FH and 5,000 patients without FH, and the predictions came back with 85 percent accuracy.

Shah said that he and Knowles knew that many of the Geisinger patients had an FH diagnosis confirmed by genetic sequencing.

“So that’s how we convinced ourselves that yes, this indeed works,” he said.

FH is an underdiagnosed genetic condition that leads to an increased risk of coronary artery disease if untreated. A patient with FH faces 10 times the risk of heart disease compared with someone with normal cholesterol. The condition can lead to death or a heart attack, and there are clear benefits of timely management, yet it is estimated that less than 10 percent of those with FH in the U.S. have been diagnosed.

Early diagnosis and treatment of FH can neutralize the threat of the condition. And one diagnosis could help multiple people because FH is genetic, making it likely that other relatives have it too.

Lead author Juan Banda, Ph.D., former research scientist at Stanford, wrote that when the algorithm is applied broadly to screen FH, it is possible to identify thousands of undiagnosed patients with the condition. This could lead to more effective therapy and screening of their families, Banda wrote.


Mike Miliard


From brain-computer interfaces to nanorobotics, a new report from Frost & Sullivan explores leading edge developments and disruptive tech.


A new study from Frost & Sullivan takes stock of some of the rapid-fire developments in the world of patient monitoring, which is expanding its capabilities by leaps and bounds with the maturation of sensors, artificial intelligence and predictive analytics.

WHY IT MATTERS
"Patient monitoring has evolved from ad hoc to continuous monitoring of multiple parameters, causing a surge in the amount of unprocessed and unorganized data available to clinicians for decision-making," according to F&S researchers. "To extract actionable information from this data, healthcare providers are turning to big data analytics and other analysis solutions.

The ability of such analytics to both assess patients in the moment and point toward their potential future condition had health systems investing more than $566 million in the technology during 2018, the report notes.

But data-crunching is only the beginning of what hospitals and healthcare providers will need to be prepared to manage in the years ahead if they hope to take full advantage of fast-evolving patient monitoring technology.

Wearables and embedded biosensors – such as continuous glucose monitors, blood pressure monitors, pulse oximeters and ECG monitors – are an obvious place to start, as health systems look to manage chronic conditions and population health, both in and out of the hospital.

But many more advances are already starting to gain traction, such as smart prosthetics and smart implants. "These are crucial for patient management post-surgery or rehabilitation," researchers said, as "they help in measuring the key parameters to support monitoring and early intervention to avoid readmission or complexities."

Other innovations set for big growth are digital pills and nanorobots, which can help monitor medication adherence. In addition, advanced materials and smart fabrics are opening new frontiers in wound management and cardiac monitoring, the report notes. And brain-computer interfaces can enable direct monitoring and measurement of key health metrics to assess patients' psychological, emotional and cognitive state.

THE LARGER TREND
In a recent interview with Healthcare IT News, digital health pioneer Dr. Eric Topol, founder and director of Scripps Research Translational Institute, was asked which developments in AI and mobile technology he thought would be most transformative in the year ahead.

"Longer term, the biggest thing of all is remote monitoring and getting rid of hospital rooms," said Topol. "And there, the opportunity is extraordinary. Because obviously the main cost in healthcare is personnel. And if you don't have hospital rooms, you have a whole lot less personnel. So setting up surveillance centers with remote monitoring – which can be exquisite and very inexpensive with the right algorithms, when it's validated – would be the biggest single way to improve things for patients, because they're in the comfort of their own home"

The value of patient monitoring is recognized at the federal level too. Centers for Medicare and Medicaid Services Administrator Seema Verma has called for expansion of reimbursement for remote care, with CMS seeking to "make sure home health agencies can leverage innovation to provide state-of-the-art care," she said.

ON THE RECORD
"In the future, patient monitoring data will be combined with concurrent streams from numerous other sensors, as almost every life function will be monitored and its data captured and stored," said said Sowmya Rajagopalan, global director of Frost & Sullivan's Advanced Medical Technologies division. "The data explosion can be harnessed and employed through technologies such as Artificial Intelligence (AI), machine learning, etc., to deliver targeted, outcome-based therapies."

Rajagopalan added that, "as mHealth rapidly gains traction, wearables, telehealth, social media and patient engagement are expected to find adoption among more than half of the population in developed economies by 2025. The patient monitoring market is expected to be worth more than $350 billion by 2025, as the focus is likely to move beyond device sales to solutions."


Samara Rosenfeld


Machine learning algorithms using administrative data can be valuable and feasible tools for more accurately identifying opioid overdose risk, according to a new study published in JAMA Network Open. 

Wei-Hsuan Lo-Ciganic, Ph.D., College of Pharmacy at the University of Florida, Gainesville, along with her research team, found that machine learning algorithms performed well for risk prediction and stratification of opioid overdose — especially in identifying low-risk subgroups with minimal risk of overdose.

Lo-Ciganic told Inside Digital Health™ that machine learning algorithms outperformed the traditional approach because the algorithms take into account more complex interactions and can identify hidden relationships that traditionally go unseen.

The researchers used prescription drug and medical claims for a 5 percent random sample of Medicare beneficiaries between January 2011 and December 2015. The team identified fee-for-service adult beneficiaries without cancer who were U.S. residents and received at least one opioid prescription during the study period.

The team compiled 268 opioid overdose predictor candidates, including total and mean daily morphine milligram equivalent, cumulative and continuous duration of opioid use and total number of opioid prescriptions overall and by active ingredient.

The cohort was randomly and equally divided into training, testing and validation samples. Prediction algorithms were developed and tested for opioid overdose using five commonly used machine-learning approaches: multivariate logistic regression, least absolute shrinkage and selection operator-type regression, random forest, gradient boosting machine and deep neural network.

Prediction performance was compared with the 2019 Centers for Medicare and Medicaid Services opioid safety measures, which are meant to identify high-risk individuals and opioid use behavior in Medicare recipients.

To find the extent to which patients predicted to be high-risk exhibited higher overdose rates than those predicted to be low-risk, the researchers compared the C-statistic and precision-recall curves across the different methods using the DeLong test.
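The C-statistic used for that comparison is the probability that a randomly chosen patient who overdosed receives a higher predicted score than a randomly chosen patient who did not. A minimal pure-Python version, run on toy scores rather than the study's data:

```python
def c_statistic(scores, labels):
    """C-statistic (ROC AUC): fraction of case/non-case pairs in which
    the case is scored higher; ties count as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: a model that ranks most cases above non-cases.
scores = [0.9, 0.8, 0.35, 0.6, 0.2, 0.1]
labels = [1,   1,   1,    0,   0,   0]
print(c_statistic(scores, labels))  # 8 of 9 case/non-case pairs ranked correctly
```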

Low-risk patients had a predicted score below the optimized threshold, medium-risk patients had a score between the threshold and the top 10th percentile, and high-risk patients were in the top 10th percentile of scores.
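That three-way stratification can be sketched as follows. The cutoff values and scores are illustrative assumptions; in the study, the optimized threshold and the top-decile cutoff come from the fitted model:

```python
def stratify(score, threshold, top_decile_cutoff):
    """Bucket a predicted overdose-risk score into low/medium/high,
    mirroring the study's scheme: below the optimized threshold is low,
    between the threshold and the top 10th percentile is medium,
    and the top 10th percentile is high."""
    if score < threshold:
        return "low"
    if score < top_decile_cutoff:
        return "medium"
    return "high"

# Illustrative cutoffs only; the real values are derived from the data.
print(stratify(0.02, threshold=0.05, top_decile_cutoff=0.30))  # low
print(stratify(0.12, threshold=0.05, top_decile_cutoff=0.30))  # medium
print(stratify(0.45, threshold=0.05, top_decile_cutoff=0.30))  # high
```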

Based on the findings, the deep neural network and gradient boosting machine performed the best, with the deep neural network having a C-statistic of 0.91 and the gradient boosting machine having a C-statistic of 0.90.

With the gradient boosting machine algorithm, 77.6 percent of the sample were categorized as low-risk, while 11.4 percent were medium-risk and 11 percent were high-risk. And with the deep neural network algorithm, 76.2 percent of people were predicted to be at low risk, and 99.99 percent of those individuals did not have an overdose.

Lo-Ciganic said that with the promising results of the study, the next step would be to develop software to be incorporated into health systems — or an electronic health record — to see if the algorithms can be applied in real-world settings to help clinicians identify high-risk individuals.

Erin Dietsche


Peter Bak, CIO of Humber River Hospital in Toronto, highlighted his organization's digital efforts around systems automation, connectivity and an analytics-focused command center.

Humber River Hospital, which is situated in Toronto, Ontario, is eagerly integrating IT into various facets of its operations. In a phone interview, CIO Peter Bak highlighted a bit of the work the organization has done.

“Hospitals are generally not organizations that adopt change dramatically,” Bak said. In fact, most healthcare organizations struggle with it.

But that’s where he comes in. His role has “morphed into helping define a culture of innovation” at Humber.

In addition to having its information in electronic form, the hospital has enabled digital patient engagement. Humber also utilizes systems automation, which can result in safer and more efficient workflows. One example Bak cited is the use of robotic devices that can move around the hospital and deliver supplies.

Plus, the hospital emphasizes connectivity. This includes linking people to assets. “People need to find things around the hospital,” Bak said. Humber has used IT to help hospital staff find wheelchairs or other needed supplies.

The connectivity front also includes bridging the gap between care teams. Tools from telecommunications company Ascom have come in handy here. Humber uses Ascom’s platform to improve person-to-person communication and person-to-system communication. Staff members can use the Ascom solutions to talk to each other, and the platform also ensures certain alerts and alarms are going to the right clinicians.

Humber River Hospital’s latest development involves a command center it implemented a little more than a year ago. The point of it, Bak said, is to leverage electronic data to provide the hospital and its care teams with analytics and information. Regarding the center, Humber is currently working on analytics as it relates to eliminating never events.

At the end of the day, the Canadian hospital’s efforts tie back to giving patients the best outcomes and quality of care.

“I watch how we communicate in the consumer world, and yet in healthcare, we are not using these technologies,” Bak said. “[W]e’re all languishing in archaic methods of communicating. That leads to bad outcomes for patients.”



Shane Whitlatch 


Healthcare IT systems are becoming increasingly convoluted. More data, more connected devices and more regulations require more systems, which ideally can communicate and exchange data — not just within a healthcare organization but among organizations. This is the idea behind healthcare interoperability. According to HIMSS, interoperability is about the extent to which systems and devices can exchange data and interpret that shared data. For two systems to be interoperable, they must be able to exchange data and present that data so that a user understands the information and can use it in their treatment and operations decisions.

HIMSS goes on to describe three progressive levels of health IT interoperability. First is “foundational” interoperability. It enables one IT system to receive a data exchange from another and does not require the receiving information technology system to interpret the data.

The next step up is “structural” interoperability. It determines the structure or format of the data exchange (i.e., the message format standards) where there is uniform movement of healthcare data from one system to another. Structural interoperability ensures that data exchanges between information technology systems can be interpreted at the data field level.

The final and highest level is “semantic” interoperability. In this situation, two or more systems or elements can exchange and use information. Semantic interoperability takes advantage of both the data exchange structure and the codification of the data. This level of interoperability supports the electronic exchange of patient summary information among caregivers and other authorized parties via potentially disconnected electronic health record (EHR) systems and other networks to improve quality, safety, efficiency and efficacy of healthcare delivery.
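A toy illustration of the codification that semantic interoperability relies on: two systems can agree on what a lab result means when both map their local codes to a shared vocabulary such as LOINC. The local codes below are invented; the LOINC code shown is, to the best of our knowledge, the standard code for serum/plasma glucose:

```python
# Hypothetical local lab codes from two systems, mapped to shared LOINC codes.
SYSTEM_A_TO_LOINC = {"GLU_SER": "2345-7"}   # hospital A's invented local code
SYSTEM_B_TO_LOINC = {"LAB0042": "2345-7"}   # hospital B's invented local code

def semantically_equal(code_a, code_b):
    """Structural interoperability moves the fields between systems;
    semantic interoperability lets both sides agree on what they mean."""
    loinc_a = SYSTEM_A_TO_LOINC.get(code_a)
    loinc_b = SYSTEM_B_TO_LOINC.get(code_b)
    return loinc_a is not None and loinc_a == loinc_b

print(semantically_equal("GLU_SER", "LAB0042"))  # True: both mean serum glucose
```

Without the shared mapping, a receiving system could store "LAB0042" correctly at the field level yet have no idea it is the same test as "GLU_SER".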

Interoperability becomes optimal when it includes data processing and interpretation, with the goal of delivering actionable information to the end user, such as clinicians and the patients themselves.


Benefits of Interoperability

The premise of interoperability is making patient care and data safety better. Other goals include improved care coordination and experiences for patients, lowered healthcare costs and more robust public health data.

But how does interoperability accomplish these objectives? Here are the five key benefits of healthcare system interoperability through better information data exchange:


1. Greater patient safety

In this day and age, medical errors should be rare. A Johns Hopkins study determined that 44 percent of medical error deaths were preventable. By creating and implementing advanced interoperability, with the aim of capturing and interpreting data across systems and applications, healthcare organizations can better prevent errors due to missing or incomplete patient data. If an error does occur, advanced interoperability enables healthcare organizations to pinpoint the cause.

Healthcare providers might not be able to exchange data with external affiliates and systems even if they have excellent interoperability within their own enterprise. Lacking data on a patient’s vital signs and history — including allergies, medications or pre-existing conditions — healthcare organizations may be prone to fatal errors.

However, if these organizations can exchange and examine data, care providers can analyze the exact cause of a medical error to detect trends in the decision making leading up to the error. Once a pattern has been identified, healthcare organizations can begin remediating these issues to prevent future errors.


2. Improved patient experiences and coordination of care

The healthcare industry provides a stunning example of inefficiency in today’s digital world. The multiple providers who may be caring for a patient do not have their care coordinated. Patients must often do administrative tasks like search for documents, fill out multiple forms, re-explain their symptoms or medical history and sort out insurance (both before and often after receiving care). In fact, the Office of the National Coordinator for Health IT revealed research that suggests only 46 percent of hospitals had electronic access at the point of care to the patient information they required from outside providers or sources.

Interoperability can vastly improve this process, giving patients faster, more accurate and coordinated treatment and enhancing their overall experience.


3. More accurate public health data

Where interoperability is present, IT systems can interact in such a way that faster and more accurate collection and interpretation of public health data are possible. This can help organizations answer pressing questions for both patients and providers. The opioid crisis provides an excellent example of why healthcare needs more robust public health data to understand the scope of that problem and continue ways to more effectively address and resolve the crisis. By facilitating the sharing and interpretation of such data, interoperability allows healthcare organizations to collectively educate one another on predicting and preventing outbreaks.


4. Reduced costs and higher productivity

Improved care and hospital safety are outcomes of system interoperability. This ability to exchange data could save the U.S. healthcare system more than $30 billion a year, according to an estimate from the West Health Institute (WHI), which recently testified before Congress. Interoperability also gives organizations the opportunity to save time with every patient encounter by getting the right data to the patient, the provider and affiliate at the right time, every time.


5. Better patient privacy and security

Patient privacy and security are the primary clinical and regulatory issues to consider when implementing interoperability. Achieving it is not an easy task, but it can help enhance the privacy and security of patient data by requiring organizations to fully assess where their protected health information (PHI) resides and with whom it needs to be shared. When organizations enter data into systems that cannot communicate with one another, for example, it becomes difficult to track all systems that touch PHI, as required by the HIPAA Security Rule. It can be even tougher to track users with access to an EHR or affiliated applications: In a study of 1 million FairWarning users, 26 percent of users were found to be poorly known or unknown to the care provider.

By promoting the interoperability of human resource management systems such as Lawson or Peoplesoft with your EHR, though, you can better identify users, track their access and more effectively manage access rights. When PHI is entered into secure, interoperable systems, organizations gain a better idea of where their data reside and who has access to them, helping them secure patient data and protect privacy.


Worth the Effort

The American Hospital Association, the Association of American Medical Colleges and several other organizations released a report in January that called for interoperability, arguing that it gives patients peace of mind because they know their providers’ decisions are based on the best, most complete information possible. Interoperability could form the foundation for a significant improvement in both patient care and experiences. Healthcare processes would become streamlined. It takes work to achieve true healthcare systems interoperability, but it’s a worthwhile undertaking.

These providers have created diverse models of care that incorporate use of patient-centered technologies with measurable outcomes.

Jeff Lagasse


Hospitals and health systems across the U.S. are seeking ways to better engage patients with a variety of handheld and home-based technologies to improve patient experience and health outcomes.

This raises the questions: How does one use technology to transform the hospital bedside? Or to increase medication adherence for hypertension? What about controlling diabetes, or reducing distress in patients with cancer?

Four healthcare organizations have developed answers to these questions: UC San Diego Health, Ochsner Health System, Sutter Health and Stanford Health Care. All have created diverse models of care that incorporate use of patient-centered technologies with measurable outcomes, and these results were recently published in Health Affairs.

IMPACT

Here are a few examples of these projects and the effects they've had.

Ochsner Health System used its online patient portal to help treat hypertension with a new digital medicine program that combined patient-reported blood pressure data, clinical data and coaching.

Outcomes showed that medication adherence among patients improved 14 percent, while 79 percent achieved greater blood pressure control. Overall, clinicians saw a 29 percent reduction in clinic visits.

Sutter Health used its patient portal to help patients self-manage their diabetes. Online reminders of hemoglobin A1c monitoring among patients with diabetes improved the rate of A1c test completion by 33.9 percent. Overall, patients with previously uncontrolled diabetes had a significant reduction in HbA1c at six months compared to usual care.

Stanford Health Care, meanwhile, used its patient portal to help patients with cancer manage stress. Patients were surveyed before clinic visits to identify unaddressed symptoms, and about 40 percent of those who responded reported experiencing distress. These responses led to more than 6,000 referrals for psychotherapy, nutrition and other services.

WHAT ELSE YOU SHOULD KNOW

In 2016, UC San Diego Health opened Jacobs Medical Center, a 245-bed hospital that offers advanced surgery, cancer care, cardiac rehabilitation, and birthing options. To put patients in direct control of their experience, an Apple tablet was placed in every patient room.

The tablets enabled patients to control room temperature, lighting and entertainment options from their beds. The tablets also enabled access to personal medical information, such as test results and schedules of medications or upcoming procedures. Photographs and biographies of their care team were also available.

Researchers found that a large share of patients said the tablets contributed to a positive patient experience, and that engagement in medical care, as measured by patients accessing their medical records, was higher.

THE TREND

Consumerism means customers have expectations of convenience, flexibility and ease of use. In healthcare, the customers are the patients, and healthcare organizations that want to attract business are increasingly looking to technological innovation to bring patients into the fold.

Aditya Bhasin, chief of web systems and vice president of software design and development at Stanford Health Care, spoke at HIMSS19 in February about his organization's attempts to do just that, emphasizing the importance of innovating from the inside, to better create something that fits a provider's specific ecosystem.