Michael Walter

Machine learning models using radiomics can help radiologists classify renal cell carcinomas (RCCs), according to new findings published in the American Journal of Roentgenology.

“CT is gradually evolving into a useful imaging tool in renal mass differential diagnosis,” wrote Xue-Ying Sun of the First Affiliated Hospital with Nanjing Medical University in China, and colleagues. “Several studies have reported that the use of enhancement threshold levels could help to distinguish RCC subtypes and discriminate RCC from benign oncocytoma with 77–84% accuracy. However, the differential diagnosis of renal mass-forming lesions is still difficult, and a variety of imaging findings have been described with different performance results reported.”

To see how machine learning can help improve such classification, the authors explored contrast-enhanced CT (CECT) scans showing 254 RCCs. A team of radiologists manually segmented lesions so that a full radiologic-radiomic analysis could be performed. A machine learning model was then trained to classify renal masses using 10-fold cross-validation, and the model’s performance was compared to that of four veteran radiologists.
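The study’s actual features and classifier are not public, so the sketch below is purely illustrative: it shows the 10-fold cross-validation scheme described above, with synthetic stand-in “radiomic” features and a toy nearest-centroid classifier in place of the team’s radiologic-radiomic model.

```python
# Illustrative sketch only: features, labels, and classifier are assumptions,
# not the published pipeline. Shows 10-fold cross-validation on 254 lesions.
import random

random.seed(0)

# Stand-in data: 254 lesions, 5 hypothetical radiomic features each.
# Label 1 = ccRCC, 0 = non-ccRCC (pRCC/chrRCC); classes get shifted means
# so the toy classifier has some signal to find.
def make_lesion(label):
    shift = 1.0 if label else 0.0
    return [random.gauss(shift, 1.0) for _ in range(5)], label

data = [make_lesion(i % 2) for i in range(254)]
random.shuffle(data)

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(5)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# 10-fold cross-validation: train on 9 folds, test on the held-out fold.
k = 10
folds = [data[i::k] for i in range(k)]
accuracies = []
for i in range(k):
    test = folds[i]
    train = [row for j in range(k) if j != i for row in folds[j]]
    c0 = centroid([x for x, y in train if y == 0])
    c1 = centroid([x for x, y in train if y == 1])
    correct = sum(
        1 for x, y in test if (dist2(x, c1) < dist2(x, c0)) == bool(y)
    )
    accuracies.append(correct / len(test))

mean_acc = sum(accuracies) / k  # average held-out accuracy across the 10 folds
```

Each lesion is scored exactly once on a fold it was never trained on, which is what lets cross-validated sensitivity and specificity be compared fairly against the radiologists’ reads.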

Overall, when differentiating clear cell RCCs (ccRCCs) from papillary RCCs (pRCCs) and chromophobe RCCs (chrRCCs), the four radiologists achieved a sensitivity that ranged from 73.7% to 96.8% and specificity that ranged from 48.4% to 71.9%. The team’s ML model had a sensitivity of 90% and specificity of 89.1% for that same scenario.

When differentiating ccRCCs from fat-poor angioleiomyolipomas and oncocytomas, the radiologists achieved a sensitivity that ranged from 73.7% to 96.8% and specificity that ranged from 52.8% to 88.9%. The ML model had a sensitivity of 86.3% and specificity of 83.3%.

Finally, when differentiating pRCCs and chrRCCs from fat-poor angioleiomyolipomas and oncocytomas, the radiologists achieved a sensitivity that ranged from 28.1% to 60.9% and a specificity that ranged from 75% to 88.9%. The ML model had a sensitivity of 73.4% and specificity of 91.7%.

“Our results show that routine expert-level radiologist interpretation of CT images had obviously large variances with relatively low accuracy in differentiation of benign and malignant solid renal masses, whereas the radiologic-radiomic ML approach provides an assessment of their ability to aid standardization of CECT interpretation,” the authors concluded. “Our radiologic-radiomic ML model, comprising quantitative radiomic features and a priori radiologic hallmarks that are different from a DL black-box algorithm, was able to significantly reduce the misclassification of renal mass lesions. Considering the interpretability of our radiologic-radiomic ML model, we believe that the radiologic-radiomic ML approach could be a potential adjunct to expert-level radiologist interpretation of CT images for improving interreader concordance and diagnostic performance in routine clinical assessment of renal masses.”

A study found that an EHR “nudge” increased breast cancer screenings by 22 percent and colorectal screenings by 14 percent.

Christopher Jason

With doctors’ busy schedules, a “nudge” is needed to prompt medical assistants to set up and order cancer screenings for doctors to sign once they see the patient, according to a study published in JAMA Network Open.

Researchers at Penn Medicine who specialize in EHR nudges found a 22 percent increase in screening orders for breast cancer and a 14 percent increase for colorectal cancer. Overall, 88 percent of the patients eligible for breast cancer screening and 82 percent of those eligible for colorectal cancer screening had a screening ordered due to the nudges.

Although the percentage of cancer screenings ordered increased, there were minimal changes in the rates of patients who followed through within one year and completed their screenings. Researchers concluded that future interventions may need to target patients directly to encourage screening completion.

“Cancer screening involves both the clinician recommending and ordering it as well as the patient taking action to schedule and complete it. Our study found nudges can be very influential, but for cancer screening they likely need to be directed to both clinicians and patients,” said Mitesh Patel, MD, MBA, director of the Penn Medicine Nudge Unit and the senior author of the study.

The study of nearly 70,000 patients eligible for breast or colorectal cancer screening at 25 primary care practices looked at how doctors can use the EHR to increase the rate at which they screen patients for these diseases.

“Clinicians are increasingly being asked to do more with a fixed amount of time with a patient,” said Esther Hsiang, the study’s lead author. “By directing the intervention to medical assistants, this reduced the burden on busy clinicians to respond to alerts and instead gave them more time to have a discussion with their patients about screening.”

In the study, the nudge was directed only to medical assistants who could create orders for clinicians to review. The medical assistants would then inform the patients that they were eligible for cancer screening and should discuss screening with their clinician. Researchers targeted medical assistants specifically to account for physician burnout challenges and EHR complexity that often bogs down physicians.

This study design lessened the burden for physicians and encouraged patients to prioritize discussing a cancer screening. However, due to the more arduous process for completing a screening order – patients usually have to schedule a second appointment – patients did not necessarily respond to the prompts.

“Once cancer screening is ordered, the patient still has to take several steps to complete it,” explained Patel. “That includes scheduling an appointment, sometimes conducting prep — such as bowel prep for a colonoscopy — and then going to the appointment. These several steps can add up to high hurdles, especially if patients have lower motivation to begin with. Future interventions should test ways to nudge patients to complete cancer screenings.”

Patel and his team are in the process of developing a new study to test nudges for both the clinicians and the patients to increase the likelihood of patients to follow through and complete their screenings. The researchers also want to branch out and gain more data from more than the two types of cancer that they initially focused on.

“Since EHRs are used by more than 90 percent of physicians, this is a really scalable approach,” Patel concluded. “It is likely that it could be successful for other types of screening.”

Care coordination and health data interoperability strengthen levels of care and reduce healthcare costs.

Christopher Jason

Driving care coordination is essential to providing a quality patient experience, helping to tie together patient care across the many healthcare facilities a patient may visit. But limited health data and EHR interoperability can get in the way, restricting providers’ ability to access patient information from disparate facilities.

Interoperability supports care coordination by delivering a patient’s health data from multiple providers and specialists. With patients visiting different hospitals and specialists, interoperability between multiple providers is key. Coordinated care also reduces healthcare costs by eliminating repetitive tests and procedures.

Strong EHR use is found at the primary care level. However, it is still a work in progress at acute and post-acute hospitals.

Care coordination crucial to cohesive primary care

Strong EHR use is key for better care coordination between primary care and behavioral health specialists, said researchers in a 2017 study published in the Journal of the American Board of Family Medicine.

The study found that 67 percent of individuals with behavioral health (BH) disorders do not receive the care that they need, but when their care is integrated into the primary care setting, that issue typically improves.

“Most patients with BH conditions, including children, are seen in medical settings, most commonly primary care (PC), presenting the need and opportunity to replace separated systems of care that do not adequately meet the needs of patients with integrated, ‘whole-person’ care,” the researchers explained.

Integrating and coordinating specialty care — in this case behavioral healthcare — into primary care relies on EHR use and interoperability. Interoperable systems allow providers to access valuable clinical information from other providers who have previously treated the patient.

“Establish standard processes and infrastructure necessary for your integrated care approach: workflows, protocols for scheduling and staffing, documentation procedures, and an integrated EHR,” the researchers recommended.

And ultimately, this will streamline patient care. Interoperable systems between specialty and primary care providers ensure the specialty provider understands the patient’s current health conditions and can make informed medical decisions.

For example, when specialty providers can access the patient’s complete medical history, they can avoid re-testing and ensure that the patient receives the best care right away.

“This allows the caregiver to quickly find information about that patient and who’s responsible for them,” Mobile Heartbeat Vice President Jamie Brasseal said. “Providers can communicate with the appropriate colleagues — such as specialists or pharmacists or case managers — very quickly, and without having to leave the patient’s bedside, or go search for that information at the nursing unit or in the EHR.”

Care coordination improving at acute care hospitals

Patients do not always receive acute or emergency healthcare in the same facility where they receive their primary care, which can create some data exchange challenges for acute care providers. With patient data stored in disparate systems, acute care providers can be left without the critical information on which to base medical decisions.

In a recent survey from PointClickCare, 49 percent of acute care providers said they have very little ability to access or share patient data electronically, resulting in a struggle for providers.

“With better communication between the facilities, we would cut back on readmission and sending patients back to the ER and any sort of miscommunication,” said one participating hospital executive.

Reassuringly, many acute care hospitals are investing in and focusing more on improving their data exchange efforts.

Seventy-three percent of acute care providers said they are putting a higher priority on implementing interoperable systems for transferring patients.

“Streamlining interoperability between systems creates huge opportunities for cost reduction, patient care improvement and reduced workloads for people on both ends of patient transfers,” researchers said.

“This type of health data exchange also helps improve the transparency of data between acute care and skilled nursing facilities, enabling a stronger relationship. And, it enables robust, population health capabilities that are scalable as the number of patients needing post-acute care grows.”

In a 2018 report from the ONC, 83 percent of hospitals that had the capabilities to send, receive, locate, and integrate patient health information from outside organizations into their EHR systems reported having the ability to access information electronically at the point of care.

“This is at least 20 percent higher than hospitals that engage in three domains and almost seven times higher than hospitals that don’t engage in any domain,” wrote Don Rucker, MD, national coordinator for health IT and Talisha Searcy, director of research and evaluation.

Educating staff and providers on EHR use and information exchange will benefit the team in the long run and provide better care for patients in acute care settings.

Promoting better EHR adoption in post-acute care

Interoperability challenges can follow patients and providers out of the hospital and into the rehabilitation process. Provider access to information about a patient’s acute hospital stay will be integral to quality post-acute care, but many providers see bumps along the road.

That same PointClickCare survey revealed that 84 percent of post-acute care organizations are still using at least some manual processes to exchange patient data with acute care hospitals. Organizations relying on fax, email, and paper-based solutions to exchange patient data could encounter mistakes, mismatched patient data, or omissions that could seriously hinder patient care.

But the Centers for Medicare & Medicaid Services (CMS) is working to address that gap.

After prompting nearly universal EHR adoption in acute care facilities, CMS is promoting widespread EHR adoption in post-acute care (PAC) settings.

In March 2019, the federal agency released a request for information (RFI) seeking input about the best ways to incentivize EHR adoption and use among providers in the post-acute setting.

“PAC facilities are critical in the care of patients’ post-hospital discharge and can be a determining step in the health progress for those patients,” stated CMS in the RFI.

“Interoperable health IT can improve the ability of these facilities to coordinate and provide care; however, long-term care and PAC providers, such as nursing homes, home health agencies, long-term care providers, and others, were not eligible for the EHR Incentive Programs under the HITECH Act,” the federal agency explained.

CMS partly attributes the slow rate of EHR adoption in PAC settings to the lack of federal incentives available to PAC providers.

Nearly 65 percent of skilled nursing facilities used an EHR system in 2016, but rates of health data exchange remained low among this population of providers. Only 30 percent of skilled nursing facilities participated in health data exchange, and only seven percent had the ability to locate and integrate patient health data into patient EHRs.

The inconsistency between rates of EHR adoption in acute and ambulatory care settings and PAC facilities partly contributes to problems with transitions of care.

“For PAC facilities that do possess EHRs, vendor adoption of interoperable functionality has been slow and uneven,” stated CMS.

As the medical industry continues to become increasingly digital and complex, it will be essential for disparate organizations to have systems for exchanging data. Interoperable tools will help drive care coordination between primary care providers, specialists, and acute and post-acute care organizations. And in doing so, clinicians can work to drive whole-person health and efficient, quality care.

Stephen Lawless

When you bring your loved one to the hospital, you expect them to get better, not worse.

But too often, we are failing at this crucial task. Too often, we hear about a patient admitted to the hospital who is seemingly doing fine, and then suddenly goes downhill. The question is “How did they get worse right under our eyes?”

How do we prevent someone from getting much sicker without us even realizing it?

Sepsis kills almost 5,000 children annually in the U.S.—more than cancer—and costs about $7.3 billion for hospitalizations alone. This huge and growing burden is now the most expensive cause of hospitalization in the U.S., with a high fatality rate that makes early recognition of patient instability absolutely critical. 

Innovation has finally caught up to this age-old issue by harnessing the power of predictive analytics. Three years ago, at Nemours Children’s Health System, a multidisciplinary system-wide team built a sepsis response tool that capitalizes on the health system’s technological resources.

Proprietary scoring criteria are built into the electronic health records to predict patient downturns before they happen. These stats are monitored by paramedics running the health system’s Clinical Logistics Center, a virtual command post that monitors every child seeking inpatient care at our free-standing children’s hospitals in Florida and Delaware.

Like air-traffic controllers peering into multiple video monitors, our team of paramedics and emergency nurses closely tracks color-coded vital signs in green, yellow or red to detect subtle changes in biomarkers that predict whether a patient is stable, declining or needs immediate attention. They triage alarms and can instantaneously initiate a rapid response team or even tap into a high-resolution audio/video connection, available in every room, to provide instant virtual care.
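Nemours’ scoring criteria are proprietary, so the sketch below is a hypothetical illustration of the color-coding idea only: each vital sign is banded into green, yellow, or red, and the worst color across a patient’s readings drives the alarm. The vital signs and threshold values are invented for the example.

```python
# Hypothetical sketch: the real scoring criteria are proprietary, so these
# vital signs and threshold values are illustrative assumptions only.

# (low_red, low_yellow, high_yellow, high_red) bands per vital sign;
# values strictly inside the yellow band count as "green" (stable).
THRESHOLDS = {
    "heart_rate": (50, 60, 120, 140),      # beats per minute
    "resp_rate":  (10, 12, 25, 30),        # breaths per minute
    "temp_c":     (35.0, 36.0, 38.0, 39.5),
}

def color_code(vital, value):
    """Map one vital-sign reading to green/yellow/red."""
    low_r, low_y, high_y, high_r = THRESHOLDS[vital]
    if value <= low_r or value >= high_r:
        return "red"      # needs immediate attention
    if value <= low_y or value >= high_y:
        return "yellow"   # declining, watch closely
    return "green"        # stable

def triage(readings):
    """The worst color across all of a patient's readings drives the alarm."""
    rank = {"green": 0, "yellow": 1, "red": 2}
    colors = [color_code(v, x) for v, x in readings.items()]
    return max(colors, key=rank.__getitem__)

# Example: tachycardia plus a mild fever escalates the patient to red.
patient = {"heart_rate": 145, "resp_rate": 18, "temp_c": 38.2}
```

The point of the design is that a single subtle biomarker shift is enough to move a patient out of green, so a decline is surfaced to the command post before it becomes a crisis.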

Machine learning and sophisticated algorithms that enable us to practice predictive analytics are not just aimed at speeding up our response to alarms. Our Clinical Logistics Center creates a smart support system that eases the alarm fatigue of nursing staff, acts as a fail-safe for patient care and can be a valuable planning tool to anticipate critical staffing needs in advance.

Nowadays, America’s hospitals have sicker patients on the general floors, patients who 20 years ago would have been in the ICU. Many of them exist in what one expert calls “a precarious state of pseudo-stability,” and most hospitals are unprepared when they unexpectedly deteriorate and need instant, life-saving therapies. Without rapid intervention, patients who go into septic shock have an overall mortality rate of more than 50 percent.

Since we set up our response system at Nemours, we have had no unexpected deaths due to sepsis, largely because no alarms go unanswered for more than 90 seconds and no patients can suffer a severe downturn without staff being quickly alerted. Overall, we have reduced medication errors through decision supports, improved patient and nurse satisfaction rates, and, most importantly, we have dramatically lowered the frequency of sepsis from 2 percent to 0.05 percent.

Recently, we were honored to be the only pediatric health system invited by the Centers for Medicare and Medicaid Services to participate in a national sepsis “listening session” among subject-matter experts and leaders in the fields of innovation, care delivery reform and implementation science. CMS’s initiative is a most welcome development in promoting early identification of high-risk sepsis patients, speeding care delivery, and enhancing nutrition, mobility and other measures to improve quality of care.

Stakeholders in the fight against sepsis are encouraged by the emerging possibilities for using “big data” and artificial intelligence. CMS heard pleas for more funding towards awards and prizes that would foster these and other innovations. Participants called for raising community awareness and improving coordination between first responders and emergency departments.  

With the backing of federal, state and local health officials, and the willingness to promote coalition-building, we can replicate and expand upon the efforts that we and other healthcare systems have launched. We can create a better system that can be a model for fighting serious diseases and for saving lives.

Samara Rosenfeld

A new imaging tool created by the National Institutes of Health (NIH)-led Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative makes it possible to capture images of more protein targets faster than traditional methods.

The imaging tool allows researchers to view dozens of proteins in a single tissue sample with thousands of neural connections.

The tool produces a rainbow of images, each one capturing different proteins within the complex network of synapses. The proteins could be present in different amounts and locations in a network.

“Such findings may shed light on key differences among synapses, as well as provide new clues into the roles that synaptic proteins may play in schizophrenia and various other neurological disorders,” wrote Francis Collins, M.D., Ph.D., director of NIH.

Researchers at Massachusetts Institute of Technology and Harvard University adapted an existing imaging method called DNA PAINT to better observe working synaptic proteins — something that has often presented many obstacles for researchers.

The adapted method is called PRISM (Probe-based Imaging for Sequential Multiplexing).

Researchers labeled proteins and molecules using antibodies that recognize the proteins. The antibodies include a DNA probe to help make the proteins visible through a microscope.

In DNA PAINT, strands of DNA bind and unbind to create a blinking fluorescence captured using super-resolution microscopy. This method, researchers said, is very slow.

To overcome this, the research team altered the DNA probes using synthetic DNA designed to bind more tightly to the antibody.

PRISM helped researchers go through the imaging process more quickly, though the resolution is slightly lower. While the research team currently captures 12 proteins in a sample in about an hour, it is possible the number could increase to 30, the researchers reported in the journal Nature Communications.

“PRISM will help (researchers) learn more mechanistically about the inner workings of synapses and how they contribute to a range of neurological conditions,” Collins wrote.

Mark Byers

The Department of Veterans Affairs (VA) is going through a time of great transformation. Much of this change is being driven by new models of healthcare delivery, the transition to value-based care, new mandates for federal agencies to modernize legacy systems, emerging innovations, as well as the new electronic health record modernization (EHRM) program.

Many of these changes point to the enhanced need to standardize proven technologies and processes to reduce the variance of care across the Veterans Health Administration (VHA), bring about best practices to each VA medical center and ensure that veteran care does not lag during the later facility deployments of the EHRM program.

By putting into place a system to identify and evaluate best of breed IT solutions across the continuum of VA medical care, the VA will be able to further improve patient care at all VA medical facilities and improve the employee experience by streamlining workflow and increasing productivity and efficiency. The VA will also receive the benefit of improved contract provisions and data standardization.

One of the most significant challenges is the transition of specialty health care applications within the current VistA ecosystem in both the short- and long-term as progress is made regarding the migration to the new, commercial-off-the-shelf EHRM. This is an important step in the process over the next 10 years and beyond.

In addition, as medical technologies continue to evolve at an exponential pace, we must ensure that veterans have access to the same quality of care that is delivered in the private sector – the pace of improvements in medical care will not stand still during the deployment of the new VA EHR.

Why the VistA transition is vital

Since its deployment at the VA in 1994, VistA has evolved into a technically complex system comprised of approximately 200 modules that support health care delivery at more than 1,500 sites of care, including each Veterans Affairs Medical Center (VAMC), Community Based Outpatient Clinics (CBOC) and Community Living Centers (CLC), as well as at nearly 300 VA Veteran Centers.

That makes it essential for the VA to establish a process to review these diverse and valuable applications that provide the interface for patient care, and on which VA medical professionals rely to offer safe and reliable healthcare for our nation’s veterans.

As the VA transitions to the new EHR, there is also a need for a similar systematic evaluation and transition of the many modules and applications currently integrated within the VistA environment – including the evaluation of critical business intelligence regarding unique VA requirements. This will help with the overall migration to the new EHR and determine which applications will be developed by the EHR contractor.

Many of these VA medical applications are also already compatible with the Cerner EHR platforms in the commercial health care sector and have demonstrated quality outcomes and cost benefits for healthcare systems.

Evaluating and standardizing existing, best of breed, health care applications within the VistA system can lead to a smoother transition to the new EHR for VA health care providers, reduce cost as these systems are already in place and functioning, maximize dollars already spent and mitigate unnecessary risk in an already highly complex transition.

Standardization reduces risk, ensures quality care

As the VA begins to execute its systematic review of the healthcare applications in use today, the review will shine a light on turnkey modules that are tailored for VA workflows, along with all-inclusive integration, continuous enhancements, maintenance and customer support. These improve the safety and quality of the services and medical care delivered to our nation’s veterans now and into the future.

In addition, the VA would benefit from an analysis of alternatives across the best of breed applications available. Thankfully, the current veteran-focused integration process, under which all solutions are technical reference model (TRM) and enterprise technical architecture (ETA) compliant, is helping.

This can be also achieved through applications that offer flexible and extensible systems of engagement; are standardized to enhance efficiencies; maintain clinician productivity; ensure ongoing veterans access to care; and lower total cost of ownership for the VA.

For example, there are commercially proven software applications that enable electronic clinical surveillance for infectious disease prevention and clinical pharmacy, including opiates. This software solution called TheraDoc provides the ability for immediate medical intervention, which currently is challenging due to the VA’s huge and dispersed population of patients – both inpatient and outpatient.

The Miami VA healthcare facility is also using the LiveData PeriOp Manager to help synchronize surgical scheduling, increasing access to care by 1.8 additional cases per day. This solution improves the patient’s journey from surgical consultation through preoperative steps to the scheduled day-of-surgery and discharge.

Both of the commercial systems highlighted above have been integrated into the VA’s current system, VistA, through the expertise of DSS, Inc.

By standardizing modern and proven solutions across all VA medical centers, it is possible to increase veteran access to care, enhance the veteran experience, and improve the VA employee experience. This also reduces risk during deployment of the modernization effort – allowing the VA to effectively manage a time of great transition.

Sara Heath

Nearly one-quarter of patients would opt into data sharing for all of their information with any interested precision medicine research party.

Patients approve of data sharing and are willing to contribute their medical information to research projects, but according to a group of researchers from the University of California San Diego, there may be some strings attached.

These findings come in the context of the precision medicine and All of Us campaigns, which call for the use of patient data repositories to create targeted treatment approaches to improve care quality. Precision medicine efforts rely on patients being willing to share data with medical researchers.

“The finding in this study that most patients were willing to share data from their EHRs and biospecimens with researchers is reassuring,” the researchers wrote. “Not only can biomedical research benefit from these resources but also a multisite learning health care system can continuously advance as a result of data-driven improvements to processes and associated outcomes.”

Overall, patients are willing to participate in precision medicine, but there are some caveats, the UCSD researchers reported in JAMA Network Open. A survey of over 1,200 patients revealed that most are willing to share at least some of their medical information with some interested research groups.

Patients filled out one of four surveys: a simple opt-in survey, a simple opt-out survey, a detailed opt-in survey, and a detailed opt-out survey.

The difference between simple and detailed surveys was the number of data categories for which the patients outlined their data sharing preferences. Patients completing a simple survey had to opt into or out of sharing in 18 data categories, compared to 59 categories in a detailed survey.

Each survey also asked patients which types of researchers they would be willing to share individual survey items with, including other researchers within their home organization, researchers at another non-profit organization, and researchers at for-profit organizations.

Overall, 67 percent of all survey respondents said they’d be willing to share all of their data with their own healthcare organizations, and 23.4 percent said they’d share all of their information with any interested research party.

This is good news for the precision medicine movement, which relies on a breadth of patient information for actionable insights, said study senior author Lucila Ohno-Machado, MD, PhD.

“These results are important because data from a single institution is often insufficient to achieve statistical significance in research findings,” Ohno-Machado, who is also professor of medicine, associate dean for informatics and technology in the UC San Diego School of Medicine and chair of the Department of Biomedical Informatics at UC San Diego Health, said in a statement.

“When sample sizes are small, it is unclear whether the research findings generalize to a larger population. Additionally, in alignment with the concept of personalized medicine, it is important to see whether it is possible to personalize privacy settings for sharing clinical data.”

Seventy-three percent of respondents said they were willing to share their medical information, but selectively. They were willing to share at least one piece of data with at least one type of research group.

Patients were most willing to share their data with researchers from their home institution, followed by separate non-profit institutions, and finally teams at for-profit organizations.

“The reluctance to share data and biospecimens with researchers from for-profit institutions needs further investigation because the category aggregates highly different industries and further refinement might reveal subgroups that have higher association with declining to share than others,” the research team said.

“Strategies to convey how data and biospecimens are being used or will be used for research that includes the development of commercial products to improve health outcomes need to be developed and implemented so that patients can provide consent that is truly informed.”

Additionally, the surveys showed that patients were willing to share some, but not all, of their personal information, which could have implications for research teams accessing patient EHRs.

“This finding is important,” wrote the authors, “because the item to withhold may not be of relevance to a certain study, but the current all-or-nothing option, if chosen, would remove that patient’s data from all research studies.”

The researchers pointed out that there needs to be a more sophisticated mechanism by which researchers can access patient EHRs. A medical record should not be prohibited from a study because the patient has withheld one singular piece of personal data, the team said, especially if that data point is not relevant to a specific study.

IT developers should look for ways to stratify patient data sharing to allow for researcher access to more patient records.
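The item-level sharing mechanism the researchers argue for can be sketched in a few lines: a record stays eligible for a study whenever the categories the patient withheld do not overlap the categories that study actually needs. The category names and the example study below are hypothetical.

```python
# Illustrative sketch of item-level (rather than all-or-nothing) consent.
# Category names and the study definition are hypothetical assumptions.

def eligible_records(patients, study_needs):
    """Return IDs of patients whose withheld items don't overlap the study's needs."""
    return [
        pid
        for pid, withheld in patients.items()
        if not (withheld & study_needs)
    ]

patients = {
    "p1": set(),                      # shares everything
    "p2": {"mental_health_notes"},    # withholds one sensitive category
    "p3": {"lab_results"},
}

# A hypothetical imaging study that never touches mental-health notes.
study_needs = {"imaging", "lab_results"}

# p2's withheld item is irrelevant to this study, so p2's record stays in;
# under all-or-nothing consent, p2 would have been excluded entirely.
included = eligible_records(patients, study_needs)
```

Under the all-or-nothing model the article describes, any patient with a non-empty withheld set would drop out of every study; the set-intersection check keeps records in scope whenever the withheld item is irrelevant.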

Furthermore, the surveys showed that how a provider asks for patient data access is important. Opt-out forms, which assume permission to access patient data unless a patient says they do not want to participate, are more effective than opt-in forms.

Additionally, whether the patient used the simple or detailed questionnaire had little impact on whether the patient gave permission to share certain types of medical information.

“This is important because a simple form could be used in the future to elicit choices from all patients, saving their time without significantly affecting their privacy preferences,” said Ohno-Machado. “However, different rates of sharing are expected for opt-in and opt-out of sharing clinical records for research.”

These findings are not a panacea for eliciting patient data sharing, Ohno-Machado said. Instead, they point out contradictions that research teams will need to assess when designing data sharing and opt-in communication protocol.

“Institutions currently make decisions on sharing on behalf of all patients who do not explicitly decline sharing. It is possible that asking patients directly would increase the amount of data shared for research,” Ohno-Machado concluded. “On the other hand, it is also possible that some types of research would suffer from small sample sizes if patients consistently decline certain categories of items.”

Mike Miliard

Researchers used open source tech from IBM Watson to build an AI model that ingests de-identified EHR data from sepsis patients, then used it to predict patient mortality during hospitalization and during the 90 days following discharge.

Geisinger and IBM this week announced that they've co-created a new predictive model to help clinicians flag sepsis risk using data from the integrated health system's electronic health record.

The new algorithm, created with help from IBM Data Science Elite, will help Geisinger create more personalized clinical care plans for at-risk sepsis patients, according to the health system, which can increase the chances of recovery by helping caregivers pay closer attention to key factors linked to sepsis deaths.

Dr. Shravan Kethireddy led a team of scientists to create a new model based on EHR data. Partnering with the IBM Data Science and AI Elite teams, researchers assembled a six-person team to develop a model to predict sepsis mortality, as well as a tool to keep the team on top of the latest sepsis research.

The researchers used open source technology from IBM Watson to build a predictive model that would ingest clinical data from thousands of de-identified sepsis patients spanning a decade, then used that model to predict patient mortality during the hospitalization period or during the 90 days following their hospital stay, officials say.

Sepsis is a potentially life-threatening condition that affects about 1.7 million American adults and is linked to more than 250,000 deaths annually. It is complex and difficult to identify early because symptoms such as fever and low blood pressure overlap with those of other common illnesses.

The new algorithm is helping Geisinger researchers identify clinical biomarkers associated with higher rates of mortality by predicting death or survival of patients in the test data.

The project revealed descriptive and clinical features that could indicate heightened risk for sepsis such as age, prior cancer diagnosis, decreased blood pressure, number of hospital transfers, time spent on vasopressor medicines and even the type of pathogen.
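The article does not publish the model itself, but mapping features like these to a mortality outcome is the shape of a standard supervised classifier. A minimal logistic-regression sketch on synthetic data (all numbers and features invented for illustration; this is not Geisinger's model):

```python
import math
import random

# Illustrative sketch only: a tiny logistic-regression model over the kinds
# of features the article names (age, time on vasopressor medicines). The
# data are synthetic; the real model used a decade of de-identified EHR data.
random.seed(0)

def make_patient():
    """Generate one synthetic patient: (features, died_in_window)."""
    age = random.uniform(30, 90)
    vaso_hours = random.uniform(0, 72)
    # Synthetic ground truth: risk rises with age and vasopressor time.
    logit = 0.04 * (age - 60) + 0.05 * (vaso_hours - 24)
    died = random.random() < 1 / (1 + math.exp(-logit))
    return [1.0, age / 100, vaso_hours / 100], died  # bias + scaled features

data = [make_patient() for _ in range(2000)]
w = [0.0, 0.0, 0.0]

for _ in range(300):  # batch gradient descent on the log-loss
    grad = [0.0, 0.0, 0.0]
    for x, y in data:
        p = 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
        for i in range(3):
            grad[i] += (p - y) * x[i]
    w = [wi - 0.01 * g / len(data) for wi, g in zip(w, grad)]

def predict(features):
    """Predicted probability of mortality for one patient."""
    return 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, features))))
```

After training, the learned weights on age and vasopressor time are positive, so the model ranks an older patient with a long vasopressor course as higher risk than a younger one with a short course, which mirrors the risk factors the project identified.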

"For clinicians, making a sepsis diagnosis can be very difficult, as the symptoms overlap with many other common illnesses," said Dr. Donna Wolk, division director of molecular and microbial diagnostics and development at Geisinger. "If we can identify patients more quickly and more accurately, we can administer the right treatments early and increase the chances of a positive outcome."

Geisinger has been a leader in its use of AI for predictive analytics. Earlier this year, its Steele Institute for Health Innovation launched a multi-year collaboration with Medial EarlySign to implement machine learning technology for detection and prevention of chronic and high-cost diseases. And in 2018 we reported on its efforts to apply machine learning to imaging data to more quickly find intracranial hemorrhage.

"Our experience using machine learning and data science has been very positive, and we see huge potential to continue its use in the medical field," said Dr. Vida Abedi, staff scientist in Geisinger's department of molecular and functional genomics. "We are well on our way to breaking new ground in clinical care for sepsis and achieving more positive outcomes for our patients."

"It's very important for me as a clinician and a research scientist to save patient lives using all the knowledge of the data and the clinical background," added Dr. Hosam Farag, a bioinformatic scientist in Geisinger's Diagnostic Medicine Institute. "Machine learning can close the care gaps and optimize the treatment. That makes me passionate about how to save patient lives."

Heather Landi

Nearly two-thirds of health plans (63%) say they are using recently proposed federal interoperability regulations as the first step toward broader strategies on interoperability, according to a new survey.

This suggests compliance with the new standards will be seen as the bare minimum in healthcare interoperability programs, according to a survey from the Deloitte Center for Health Solutions. Forty-three percent of health system chief technology officers or chief information officers also said the proposed interoperability standards will be the baseline for broader strategic interoperability initiatives.

The Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health IT (ONC) published proposed rules back in February designed to drive the industry toward widespread interoperability.

CMS' proposed rule would require Medicaid, the Children’s Health Insurance Program, Medicare Advantage plans and Qualified Health Plans to make enrollee data immediately accessible via application programming interfaces (APIs) by Jan. 1, 2020.
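The APIs contemplated by the CMS rule are standards-based (built on HL7 FHIR), serving enrollee data as structured JSON resources. A minimal illustration of the kind of resource such an API exchanges (all field values invented):

```python
import json

# Illustrative only: a FHIR-style Patient resource of the kind a payer API
# would serve under the proposed rule. Values are invented for this sketch.
patient = {
    "resourceType": "Patient",
    "id": "example-enrollee",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1970-01-01",
}

payload = json.dumps(patient)   # what the API would transmit over the wire
decoded = json.loads(payload)   # what a consumer app would parse back out
```

Because the payload is plain JSON with standardized field names, any third-party app an enrollee authorizes can parse it without plan-specific integration work, which is the point of mandating a common API standard.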

ONC also unveiled its information blocking rule that defines exceptions to data blocking and fines that may be associated with the practice. The rule was mandated by the 21st Century Cures Act.

Many healthcare groups and stakeholders have voiced concerns about the interoperability rules, urging CMS to take a phased approach to implementing the standards, saying the proposed 2020 implementation timeline is "unrealistic." Some groups, like the Health Innovation Alliance, have called for the recently proposed interoperability rules to be scrapped and rewritten.

Federal policymakers are using multiple regulatory levers to advance interoperability such as new payment models, the Trusted Exchange Framework and Common Agreement and a recent executive order on transparency.

"Taken together, these initiatives showcase the administration’s continued push to make health care information more accessible by encouraging plans and providers to share data with each other to improve the quality and efficiency of health care and with patients to help them make informed decisions," Deloitte Center for Health Solutions executives wrote in a report about the survey results.

The survey found that some healthcare organizations are just ticking off the boxes before moving on to other priorities. Nearly half of health systems (49%) and 34% of health plans say they have no plans to go beyond compliance requirements as part of the new interoperability rules. Eight percent of health systems and 3% of insurance technology executives said they have not read the proposed rules or are still determining the implications of the rules.

The survey also revealed that most healthcare organizations are going beyond the interoperability solutions provided by their vendors. Fifty-five percent of health system executives and 60% of health plan executives say they are building their own API solutions, in some cases while also working with a vendor to build them.

About 40% of healthcare organizations are using vendors and have access to APIs provided by vendor applications or packages. Only about 3% of healthcare executives say their organizations currently do not use APIs.

"By 2040, we expect the system to be dramatically different than it is today," wrote the report authors. "Health will likely be driven by digital transformation enabled by radically interoperable data and open, secure platforms. Moreover, consumers will own their health data and play a central role in making decisions about their health and well-being."

Organizations that develop and implement a strategic approach to interoperability are likely to have a competitive advantage with insights, affordability and consumer engagement in the future of health. Healthcare organizations that fail to see beyond compliance deadlines will fall behind, Deloitte executives said.

The report authors recommend three key steps for healthcare organizations to consider when developing an interoperability strategy:

  • Define the interoperability vision for the organization—Organizations should leverage the regulatory requirements on interoperability as a jumping off point for their broader strategy for sharing data with industry stakeholders and with patients. Establish an initial interoperability governance structure, and develop a business case and key business/technology benefits. Also, identify high-value use cases such as enhanced care management and/or improved consumer and patient engagement.
  • Assess the current state—Evaluate the organization's current interoperability capabilities and define the desired future state, then conduct a gap analysis between current and future state. Develop an external engagement plan; for example, forge partnerships and encourage collaboration with external entities to better enable the interoperability vision.
  • Develop an execution road map—Prioritize a set of initiatives and road map to achieve compliance by the proposed Jan. 1, 2020, deadline. Identify longer-term goals beyond the Jan. 1, 2020, compliance date around data exchange, digital tool adoption and enhanced consumer engagement. Assign “high priority” to the must-do/critical capabilities.

Seth Augenstein

The health histories of some 20 million people in Kansas and Missouri, compiled in “comprehensive health records,” will be shared across state lines, according to a recent data sharing agreement.

The deal between the Missouri Health Connection (MHC) and the Kansas Health Information Network (KHIN), along with KHIN subsidiary KAMMCO Health Solutions (KHS), will aggregate data from both exchanges, creating a new comprehensive medical record.

The new insights will lead to quicker treatments, reduce redundant testing and procedures, and improve coordination and decision-making, according to the agencies. 

Some experts told Inside Digital Health™ that the partnership is a net win for patients and caregivers, as well.

“The connection of the KHIN and MHC networks solves many challenges with the exchanging of electronic health data today,” said Laura McCrary, Ed.D., executive director of KHIN and president and CEO of KHS. “Patients’ medical records will be electronically available to their physicians and other healthcare providers any time of day. This is critically important as there are times a patient may not be able to communicate all of their health history to their physician or hospitalist in an emergency.”

The new records will be created using a “private and secure technology” producing a record that is longitudinal, and updated electronically and in real-time, according to officials. KAMMCO, doing business as SHINE of Missouri, is physician-led and has partnered with the Missouri State Medical Association.

Calling it an “epic win for Missouri, Kansas and the Midwest,” officials said the deal will affect a majority of patients in both states.

MHC’s network extends to more than half of the in-patient care in Missouri, through 75 hospitals, several hundred clinics and 14 community health centers.

KHIN’s reach in Kansas extends to more than 125 hospitals, nearly three-quarters of the physician practices, as well as pharmacies, home health providers, health plans and long-term-care facilities.

“Making a connection to each other was a sound way for MHC and KHIN to demonstrate our commitment to serving the healthcare providers in our respective networks,” said Angie Bass, president and CEO of MHC. “Data sharing between MHC and KHIN dramatically increases the value of health information exchange to our healthcare customers.”

McCrary told Inside Digital Health™ by phone from the Strategic Health Information Exchange Collaborative conference in Washington, D.C., that both exchanges create interfaces from the EHRs to the secure interchange platform. HL7 (Health Level Seven) v2 is the most common data transport standard used to build the data feeds for labs, notes and other categories. The CCD and ADT data feeds are held in a central data repository, and when a doctor or other healthcare provider needs information, their records system queries the repository, either automatically or through a manual portal.

Now, when a query comes in to either exchange, it will also query the other state, McCrary said.
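The pattern McCrary describes, a central repository per exchange with queries that now fan out to the partner across the state line, can be sketched as follows (class and method names are hypothetical, not from either exchange's actual software):

```python
# Hypothetical sketch of the federated-query pattern described above: each
# exchange keeps CCD/ADT documents in a central repository; a query to one
# exchange now also fans out to its partner exchange across the state line.

class Repository:
    def __init__(self, name):
        self.name = name
        self.docs = {}  # patient_id -> list of stored documents

    def store(self, patient_id, doc):
        self.docs.setdefault(patient_id, []).append(doc)

    def query(self, patient_id):
        return [(self.name, d) for d in self.docs.get(patient_id, [])]

class FederatedExchange:
    """Queries the local repository first, then any partner exchanges."""
    def __init__(self, local, partners=()):
        self.local = local
        self.partners = list(partners)

    def query(self, patient_id):
        results = self.local.query(patient_id)
        for partner in self.partners:
            results.extend(partner.query(patient_id))
        return results

khin = Repository("KHIN")
mhc = Repository("MHC")
khin.store("p1", {"type": "ADT", "event": "admit"})
mhc.store("p1", {"type": "CCD", "summary": "continuity-of-care document"})

# A Kansas provider's query now returns the Missouri record as well.
kansas_view = FederatedExchange(khin, partners=[mhc]).query("p1")
```

The provider-facing behavior is unchanged (one query, one result set); the federation happens behind the exchange, which is why the two networks could connect without reworking every EHR interface.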

Kate Shamsuddin, M.S., the senior vice president of strategy at Definitive Healthcare, a Massachusetts-based healthcare data and analytics firm, said that both KHIN and MHC use some of the most common HIE systems – making it much easier to combine forces. Both entities also have a history of partnering with other organizations to expand access to data, she said. 

It’s likely to be a net win for patients, added Shamsuddin.

“According to Definitive Healthcare data, Missouri Health Connection currently serves a large number of rural hospitals in the Midwest, so this partnership will also help streamline remote care cases by allowing rural providers to quickly discover the patient record and deliver the appropriate care,” she said. “In rural areas, removing any barriers, particularly when dealing with time-sensitive events is absolutely crucial.”

KAMMCO’s health analytics and information exchange services are also used in Connecticut, New Jersey, Georgia, South Carolina and Louisiana.

Samara Rosenfeld 

Personal digital health profiles show promise in a step-wise approach to chronic disease prevention, according to research published in the journal BMC Public Health.

Of the 22% of patients advised to get a health check at their general practitioner, 19% did. And of the nearly 25% of patients advised to schedule an appointment for behavior-change counseling at their municipal health center, 21% took the advice.

Participants were more likely to attend targeted preventive programs if they had fair or poor self-rated health, a body mass index above 30 or low self-efficacy, or if they were female, non-smokers or led a sedentary lifestyle.

A Danish research team implemented a step-wise approach in the Danish primary care sector for the systematic and targeted prevention of chronic disease.

The researchers designed an early detection and prevention intervention for Type 2 diabetes mellitus, cardiovascular disease and chronic obstructive pulmonary disease (COPD). The intervention had two elements:

  1. General intervention. This involved the creation of a personal digital health profile for each individual in the study population.
  2. Targeted intervention. This intervention included a health check at the general practitioner or behavior-change counseling at a municipal health center. The targeted interventions were for patients who were deemed likely to benefit from such interventions due to their high overall risk of the chronic conditions or because they regularly engaged in health-risk behaviors.

More than 8,800 patients between the ages of 29 and 60 from 47 general practitioners participated in the study.

Participants received a digital invitation and consent form prior to the study.

The aims of the digital health profiles were centered on four key ideas:

  1.  To motivate and enable patients who otherwise would not have taken up a targeted intervention like the one offered.
  2. To motivate and enable patients with poor self-management skills to take up the targeted intervention.
  3. To guide patients with good self-management skills to change their own behavior.
  4. To keep the healthy and low-risk population from demanding unnecessary health checks from their general practitioner. 

Digital health profiles contained clear and concise personalized health information and recommendations for further action. Recommendations included advice to take up a targeted preventive program, facts about health-risk behavior, information about the positive impact of behavior change and a personalized list of available and relevant behavior-change interventions.

Researchers created the digital health profiles based on the patients’ electronic health records and questionnaire information, which included health-risk behaviors, family history of disease, and early symptoms of COPD and osteoarthritis.

Participants were then stratified into one of four groups.

The first group consisted of patients who had treatment for hypertension, hyperlipidemia, Type 2 diabetes mellitus, cardiovascular disease and/or COPD at their general practitioner. The patients in this group did not have any additional intervention beyond usual care.

Patients in the second group were those who would likely benefit from a health check at their general practitioner, as determined by three risk algorithms for the chronic conditions. These patients were advised to schedule a check with their practitioner, which included a medical examination and a subsequent health counseling session.

In the third group were patients who were not flagged by the risk algorithms but had a body mass index above 35 and/or reported they regularly engaged in health-risk behavior. Risky behaviors included daily smoking, high-risk alcohol consumption, unhealthy dietary habits and sedentary leisure time activities. Patients in this group were advised to schedule a 15-minute telephone-based counseling session. These could be requested online through the digital health profile.

 Patients with a healthy lifestyle and no need for further intervention made up the fourth group.
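The four-group assignment described above is essentially a short decision procedure. A sketch of that logic (the ordering and the BMI threshold follow the article; the record fields and risk flag are assumed names):

```python
# Illustrative sketch of the study's four-way stratification. The ordering
# and thresholds follow the article; field names are assumptions.

def stratify(patient):
    """Assign a participant to one of the study's four groups."""
    # Group 1: already in treatment at the GP -> usual care, no intervention.
    if patient["in_treatment"]:
        return 1
    # Group 2: flagged by any of the three chronic-disease risk algorithms
    # -> advised to schedule a GP health check.
    if patient["risk_algorithm_flag"]:
        return 2
    # Group 3: not flagged, but BMI above 35 and/or regular health-risk
    # behavior -> advised to schedule telephone-based counseling.
    if patient["bmi"] > 35 or patient["risky_behavior"]:
        return 3
    # Group 4: healthy lifestyle, no further intervention.
    return 4

p = {"in_treatment": False, "risk_algorithm_flag": False,
     "bmi": 36.2, "risky_behavior": False}
```

Note that the checks are ordered: a patient already in treatment lands in group 1 even if the risk algorithms would also flag them, which matches the study's stated design.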

Deciphering the Findings

Women and participants with sedentary leisure behavior were more likely to attend a health check at their general practitioner. Attendance rates revealed that physical activity was the strongest predictor: attendance was 28% among those with sedentary leisure behavior, compared with 17% among those who were physically active in their leisure time.

Among participants with fair or poor self-rated health, 20% of smokers and 42% of non-smokers attended the telephone-based counseling session.

Overall, the attendance rate both for patients who were advised to schedule a health check and for those who were advised to schedule a counseling session was near 20%.

“This study suggests that a personal digital health profile may help foster a more equitable uptake of preventive programs in the primary care sector — especially among patients with lower self-efficacy and fair to poor self-rated health,” the authors wrote.

The researchers suggest that further research is needed on personal digital health profiles.

Jared Kaltwasser

The premise of clinical decision support (CDS) software is clear: Use technology to leverage the power of big data to improve patient care and, theoretically, drive down costs. But the technology is, in many ways, still at the starting gate, in part due to technical and bureaucratic hurdles and a lack of scientific data surrounding its use.

Now, a major academic medical center, the University of Virginia Health System, is launching a concerted effort to find ways to integrate CDS into its organization. The evaluation is part of a larger effort at the health system to boost value-based care. It could prove to be a model to help other medical centers take the leap.

Joseph Wiencek, Ph.D., an assistant professor of pathology at UVA, said the health system is looking at products developed in house and elsewhere.

“We have several rules built from our internal analytics/informatics teams but would like to see if there is any value in external support that is becoming more widely available,” he told Inside Digital Health™. “These decision support tools would likely target high-volume, low-cost tests as well as high cost, low volume.”

The team remains in the evaluation and “data-crunching” phase, but he said one important element of the evaluation will be to get buy-in from the health system stakeholders.

“Since we are an academic teaching hospital, it is important to research and strategically partner with service lines and department leads to make sure we achieve lateral buy-in from our colleagues,” he said.

He and teammate Andrew Parsons, M.D., MPH, an assistant professor of medicine at UVA, recently wrote about how to reduce laboratory costs, noting that “low-value care” — care that could be eliminated without harming patient safety — costs the U.S. healthcare system an estimated $800 billion each year.

They wrote that integrating decision support tools into electronic health record software could help reduce unnecessary costs. However, given the relative novelty of these types of products, health systems should evaluate them carefully before integration.

Wiencek told Inside Digital Health™ that there simply isn’t much in the way of scientific literature when it comes to the effects of CDS.

“I think the lack of peer-reviewed literature has an enormous impact,” he said.

While the technical work that’s required to implement the systems can be difficult, Wiencek said scientific evaluation will also be key.

“It truly is a multi-modal approach, and support from your colleagues and evidence-based literature will really be the only way these tools will succeed,” he said.

Ultimately, the success of any decision support technology will be about more than just cost. For some institutions, that could be the benchmark. But for others, victory might resemble patients receiving the right test at the right time, he said.

In fact, a recent study by researchers at the Massachusetts Institute of Technology found that CDS helped providers make more appropriate decisions but didn’t result in cost savings, in part because sometimes the most appropriate decision was a high-cost test.

Wiencek said while cost is a major issue in healthcare generally, he doesn’t think people should overemphasize its importance. Besides, he said, the cost implications of better decisions might not appear in the short term.

“Doing the right test for the right patient will lead to better clinical decisions, and patients will get the care that they need or be diagnosed faster,” he said. “If this happens, costs will fall, too. I’d like to see these types of tools lead us in that direction.”

Wiencek offered no timeline as to when UVA will complete its evaluations, though he said the team is proceeding at a “steady pace.” The health system is currently working with a single external vendor, though he said they have been approached by several others.

Samara Rosenfeld

The U.S. Food and Drug Administration (FDA) has released a letter in support of open data sharing through efforts like the Patient Safety Movement Foundation’s Open Data Pledge, according to an announcement today.

While the FDA did not sign the Open Data Pledge, which is meant for companies, it supports the principles of it.

“We encourage policymakers, healthcare entities including hospitals, digital health technology companies, medical device manufacturers and others to share data to support patient safety,” Jeffrey Shuren, M.D., J.D., director of the FDA Center for Devices and Radiological Health, wrote in a letter to Joe Kiani, founder of the Patient Safety Movement Foundation, which aims to eliminate preventable deaths.

The Center for Devices and Radiological Health supports data sharing to protect patients and promote public health, Shuren noted. He wrote that openly sharing data with patients, providers and researchers could:

  • Empower patients to participate in the development and evaluation of medical devices that meet their needs
  • Facilitate medical device surveillance and help identify and prevent adverse effects
  • Increase the FDA’s knowledge of the benefits and risks of technologies, which could enhance patient safety

When individuals and companies sign the Open Data Pledge, they agree to allow anyone who wants to improve patient safety to interact with their products and access the data that are collected. The agreement is subject to privacy laws.

“We are grateful for FDA’s recognition of our work and thank the nearly 100 enlightened companies that have signed the Open Data Pledge,” Kiani said. “Patient harm can be avoided with predictive algorithms and decision support using data from the myriad of products that touch the patient.”

Jay Haughton, RN

All across the United States, the delivery of care is stressful for both patients and doctors. Patients want better access to their information and to be actively engaged in their own care. Doctors want to spend more time with patients but face intense time pressures.

According to a 2018 survey, 60 percent of doctors report they spend between 13 and 24 minutes on average with each patient. During some of these precious minutes, they are struggling to follow electronic health record (EHR) requirements and processes. Current EHRs do not fit clinical workflows: the patient is asked the same questions multiple times, providers struggle with fragmented systems that require separate log-ins, and many of the processes are simply not clinically useful.

Click fatigue and multitasking can lead to mistakes. It’s estimated that multitasking immediately decreases productivity and accuracy by 40 percent. Additionally:

  • 70 percent of doctors using EHRs attribute the bulk of their administrative burden to the software, according to a 2017 study. However, doctors’ opinion of EHRs improved when their medical institutions made efforts to optimize how the software is used.
  • 92 percent of clinicians say lengthy prior authorization protocols have impeded timely patient access to care and harmed patient clinical outcomes, according to an American Medical Association survey.
  • 89 percent of senior patients (age 55 and older) surveyed said they want to manage their own healthcare—and will require better health technology access to do so.

A more thoughtful EHR can deliver a better experience for both sides. What’s needed is a tool that leverages cutting edge technology to deliver better usability, flexibility, and value, designed by clinicians who truly understand the healthcare workflow. For patients, an EHR should provide a patient portal that integrates data into a clinical registry, allowing access to all of their data in a single location.

Electronic enterprise-wide data is essential to manage the patients doctors care for every day. Unfortunately, current EHRs typically do not deliver the insights or tools providers need to manage their high-risk patients when they are not in the hospital. Even if the specific EHR does offer such population health management capabilities, it again requires excessive amounts of manual data access and manipulation, leading to time wasted and higher costs.

With the introduction of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) and the Merit-based Incentive Payment System (MIPS), along with alternative payment models (APMs), providers are being reimbursed based on performance rather than fee-for-service. One of the performance measurements is Promoting Interoperability (formerly Advancing Care Information), and new CEHRT-qualified EHR systems are ready to meet this requirement.

To improve outcomes via improved data sharing and automation, the next generation of EHRs offer these four improvements:

Usability: Make key clinical data easily available by streamlining workflows and navigation with fewer clicks and a common patient banner, which puts certain patient information in the same location regardless of application. This empowers providers to focus on the work that matters most. The EHR should integrate and aggregate data into a clinical registry, allowing patients to access all of their data from a single portal.

Flexibility: Care organizations have numerous regulatory requirements and certification standards. A better EHR allows organizations to create additional fields to meet the unique needs of their workflow. Organizations can define and link fields to medical code sets to stay current with ever-changing regulatory requirements and advancements in healthcare information technology.

Technology: Leverage the latest technology for a scalable and portable solution that meets doctor and patient needs today, while avoiding vendor lock-in and enabling constant improvements. Solutions that use cloud-based infrastructure can do this while keeping patient data secure and up-to-date.

Value: Next-generation EHR solutions do not need to be costly. They can provide greater value—including all implementation and support costs—without sacrificing functionality. Cloud-based infrastructure eliminates the demand for large in-house IT staffs and data storage, allowing outsourced IT to handle the heavy lifting.

Both sides of the healthcare equation are under strain, and it doesn’t have to be this way. Technology has created the challenge, and better technology can provide the solution. It’s past time to fulfill the original promise of EHRs—reducing risk, improving efficiencies, and supporting high quality patient outcomes.

In a clinician survey, researchers found healthcare organizations that invest in EHR training report higher levels of EHR user satisfaction.

Kate Monica 

Administering sufficient EHR training to clinicians may be the key to improving rates of EHR user satisfaction, according to a recent clinician survey by the KLAS Arch Collaborative.

Researchers including Julia Adler-Milstein, Christopher A. Longhurst, and others analyzed Arch Collaborative survey data from tens of thousands of EHR users to identify the factors that influence whether a user will report higher levels of EHR satisfaction.

“We as an industry have an opportunity to improve EHR adoption by investing in EHR learning and personalization support for caregivers,” wrote researchers in the study.

“If health care organizations offered higher-quality educational opportunities for their care providers — and if providers were expected to develop greater mastery of EHR functionality — many of the current EHR challenges would be ameliorated,” they stated.

Researchers noted during their review that users of the same EHR system often report significantly different experiences with the software. Less than 20 percent of variation in user experience could be explained by EHR software type, while over 50 percent of variation resulted from differences in the way clinicians interacted with their system, researchers wrote.

“Similarly, within the seven EHR solutions measured, a very unsuccessful provider organization was identified in each customer base, and a successful customer was identified in six of the seven customer bases,” researchers stated.

Healthcare organizations interested in improving rates of EHR satisfaction among clinicians are more likely to see improvements if they invest in EHR training and assist users in becoming more adept at navigating EHR technology rather than investing in a new EHR implementation.

“In the Arch Collaborative large dataset, the single greatest predictor of user experience is not which EHR a provider uses nor what percent of an organization's operating budget is spent on information technology, but how users rate the quality of the EHR-specific training they received,” researchers wrote.

The team identified 475 instances in which two physicians in the same specialty used the same EHR system at the same organization yet reported very different user experiences.

“In over 89 percent of these instances, the physician who strongly agreed also reported better training, more training efforts, or more effort expended in setting up EHR personalization,” emphasized researchers.

Researchers recommended healthcare industry stakeholders implement standards to ensure clinicians across organizations receive high-quality EHR training.

The researchers suggested that requiring at least 4 hours of EHR training for new providers may help to improve rates of EHR satisfaction industry-wide.

“Organizations requiring less than 4 hours of education for new providers appear to be creating a frustrating experience for their clinicians,” wrote researchers. “These organizations have lower training satisfaction, lower self-reported proficiency, and are less likely to report that their EHR enables them to deliver quality care.”

Standardizing how EHR training is structured would be more challenging. Researchers observed significant variation in the ways different healthcare organizations structure their training and educational programs, and they were unable to identify a single program structure that consistently outperformed the others.

However, researchers did note that user personalization features were typically underutilized in EHR training programs.

“One of the most consistent observations seen across the collaborative organizations is how powerful EHR personalization can be and how much adoption is lacking today,” wrote researchers.

Investing resources in ongoing education that assists EHR users with system personalization may help to promote EHR optimization and improve rates of EHR satisfaction.

Looking ahead, researchers recommended healthcare organizations prioritize EHR training so that clinicians fully understand the limits of their systems and are confident in their ability to navigate the technology.

“While the Arch Collaborative research has convinced us that the greatest opportunity for progressing the value of the EHR currently lies in improved user training, this approach clearly needs to be balanced with a parallel focus on better designed and smarter software that can better meet nuanced needs of health care,” noted researchers.

“For EHR software to revolutionize health care, both the software and the use of that complicated software need to progress in parallel,” the team added.