Blog from September, 2019

Sara Heath

Nearly one-quarter of patients would opt into sharing all of their information with any interested precision medicine research party.

Patients approve of data sharing and are willing to contribute their medical information to research projects, but according to a group of researchers from the University of California San Diego, there may be some strings attached.

These findings come in the context of the precision medicine and All of Us campaigns, which call for the use of patient data repositories to create targeted treatment approaches to improve care quality. Precision medicine efforts rely on patients being willing to share data with medical researchers.

“The finding in this study that most patients were willing to share data from their EHRs and biospecimens with researchers is reassuring,” the researchers wrote. “Not only can biomedical research benefit from these resources but also a multisite learning health care system can continuously advance as a result of data-driven improvements to processes and associated outcomes.”

Overall, patients are willing to participate in precision medicine, but there are some caveats, the UCSD researchers reported in JAMA Network Open. A survey of over 1,200 patients revealed that most are willing to share at least some of their medical information with some interested research groups.

Patients filled out one of four surveys: a simple opt-in survey, a simple opt-out survey, a detailed opt-in survey, and a detailed opt-out survey.

The difference between simple and detailed surveys was the number of data categories for which the patients outlined their data sharing preferences. Patients completing a simple survey had to opt into or out of sharing in 18 data categories, compared to 59 categories in a detailed survey.

Each survey also asked patients which types of researchers they would be willing to share individual survey items with, including other researchers within their home organization, researchers at other non-profit organizations, and researchers at for-profit organizations.
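The choices described above amount to a per-category, per-recipient consent matrix. A minimal sketch of how such preferences could be represented (the category and recipient names below are illustrative placeholders, not the survey's actual items):

```python
# Sketch of a per-patient data-sharing preference matrix.
# Category and recipient names are illustrative placeholders,
# not the actual survey items.

RECIPIENTS = ("home_institution", "other_nonprofit", "for_profit")

# True means the patient consents to share that data category
# with that type of researcher.
preferences = {
    "demographics":  {"home_institution": True,  "other_nonprofit": True,  "for_profit": False},
    "lab_results":   {"home_institution": True,  "other_nonprofit": False, "for_profit": False},
    "mental_health": {"home_institution": False, "other_nonprofit": False, "for_profit": False},
}

def may_share(category: str, recipient: str) -> bool:
    """Return True only if the patient explicitly opted in for this pairing."""
    return preferences.get(category, {}).get(recipient, False)
```

Defaulting unknown pairings to `False` mirrors an opt-in design; an opt-out design would flip that default.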

Overall, 67 percent of all survey respondents said they’d be willing to share all of their data with their own healthcare organizations, and 23.4 percent said they’d share all of their information with any interested research party.

This is good news for the precision medicine movement, which relies on a breadth of patient information for actionable insights, said study senior author Lucila Ohno-Machado, MD, PhD.

“These results are important because data from a single institution is often insufficient to achieve statistical significance in research findings,” Ohno-Machado, who is also professor of medicine, associate dean for informatics and technology in the UC San Diego School of Medicine, and chair of the Department of Biomedical Informatics at UC San Diego Health, said in a statement.

“When sample sizes are small, it is unclear whether the research findings generalize to a larger population. Additionally, in alignment with the concept of personalized medicine, it is important to see whether it is possible to personalize privacy settings for sharing clinical data.”

Seventy-three percent of respondents said they were willing to share their medical information selectively: they would share at least one piece of data with at least one type of research group.

Patients were most willing to share their data with researchers from their home institution, followed by separate non-profit institutions, and finally by teams at for-profit organizations.

“The reluctance to share data and biospecimens with researchers from for-profit institutions needs further investigation because the category aggregates highly different industries and further refinement might reveal subgroups that have higher association with declining to share than others,” the research team said.

“Strategies to convey how data and biospecimens are being used or will be used for research that includes the development of commercial products to improve health outcomes need to be developed and implemented so that patients can provide consent that is truly informed.”

Additionally, the surveys showed that patients were willing to share some, but not all, of their personal information, which could have implications for research teams accessing patient EHRs.

“This finding is important,” wrote the authors, “because the item to withhold may not be of relevance to a certain study, but the current all-or-nothing option, if chosen, would remove that patient’s data from all research studies.”

The researchers pointed out that there needs to be a more sophisticated mechanism by which researchers can access patient EHRs. A medical record should not be excluded from a study because the patient has withheld one single piece of personal data, the team said, especially if that data point is not relevant to that study.

IT developers should look for ways to stratify patient data sharing to allow for researcher access to more patient records.
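One way developers might implement that kind of stratification is to include a record whenever the patient has consented to every data element a given study actually needs, rather than excluding the record outright. A hypothetical sketch (the field names and record structure are invented for illustration):

```python
# Hypothetical sketch of item-level (rather than all-or-nothing) record
# selection: a record qualifies for a study if the patient has consented
# to every data element that study actually requires.

def eligible_records(records, study_fields):
    """Return records whose consented fields cover the study's needs.

    records: list of dicts, each with a 'consented_fields' set.
    study_fields: set of data elements the study requires.
    """
    needed = set(study_fields)
    return [r for r in records if needed <= r["consented_fields"]]

patients = [
    {"id": 1, "consented_fields": {"age", "labs", "medications"}},
    {"id": 2, "consented_fields": {"age", "labs"}},  # withheld medications
    {"id": 3, "consented_fields": {"age"}},
]

# A study needing only age and labs can still use patient 2's record,
# even though patient 2 withheld medication data.
usable = eligible_records(patients, {"age", "labs"})
```

Under an all-or-nothing scheme, patient 2's partial withholding would remove the record from every study; here it only removes it from studies that need the withheld item.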

Furthermore, the surveys showed that how a provider asks for patient data access matters. Opt-out forms, which assume consent to data access unless a patient states they do not want to participate, yielded higher sharing rates than opt-in forms.

Additionally, whether the patient used the simple or detailed questionnaire had little impact on whether the patient gave permission to share certain types of medical information.

“This is important because a simple form could be used in the future to elicit choices from all patients, saving their time without significantly affecting their privacy preferences,” said Ohno-Machado. “However, different rates of sharing are expected for opt-in and opt-out of sharing clinical records for research.”

These findings are not a panacea for eliciting patient data sharing, Ohno-Machado said. Instead, they point out tensions that research teams will need to assess when designing data sharing and opt-in communication protocols.

“Institutions currently make decisions on sharing on behalf of all patients who do not explicitly decline sharing. It is possible that asking patients directly would increase the amount of data shared for research,” Ohno-Machado concluded. “On the other hand, it is also possible that some types of research would suffer from small sample sizes if patients consistently decline certain categories of items.”

Mike Miliard

Researchers used open source tech from IBM Watson to build an AI model that would ingest de-identified EHR data from sepsis patients, then used it to predict patient mortality during hospitalization and during the 90 days following discharge.

Geisinger and IBM this week announced that they've co-created a new predictive model to help clinicians flag sepsis risk using data from the integrated health system's electronic health record.

The new algorithm, created with help from IBM Data Science Elite, will help Geisinger create more personalized clinical care plans for at-risk sepsis patients, according to the health system, which can increase the chances of recovery by helping caregivers pay closer attention to key factors linked to sepsis deaths.

Dr. Shravan Kethireddy led a team of scientists to create a new model based on EHR data. Partnering with the IBM Data Science and AI Elite teams, researchers assembled a six-person team to develop a model to predict sepsis mortality as well as a tool to keep the team on top of the latest sepsis research.

The researchers used open source technology from IBM Watson to build a predictive model that would ingest clinical data from thousands of de-identified sepsis patients spanning a decade, then used that model to predict patient mortality during the hospitalization period or during the 90 days following their hospital stay, officials say.

Sepsis is a potentially life-threatening condition that affects about 1.7 million American adults but is complex and, because symptoms such as fever and low blood pressure overlap with other common illnesses, is difficult to identify early. The condition is linked to more than 250,000 deaths annually.

The new algorithm is helping Geisinger researchers identify clinical biomarkers associated with higher rates of mortality by predicting death or survival of patients in the test data.

The project revealed descriptive and clinical features that could indicate heightened risk for sepsis such as age, prior cancer diagnosis, decreased blood pressure, number of hospital transfers, time spent on vasopressor medicines and even the type of pathogen.
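The article does not describe the model's internals, but the factors it lists can be illustrated with a toy additive risk score. The weights, thresholds, and pathogen list below are invented purely for demonstration and bear no relation to Geisinger's actual model:

```python
# Toy illustration only: an additive risk score over the factor types the
# article mentions (age, prior cancer, blood pressure, hospital transfers,
# time on vasopressors, pathogen type). All weights and cutoffs are
# invented for demonstration; this is not Geisinger's model.

HIGH_RISK_PATHOGENS = {"pseudomonas", "candida"}  # hypothetical examples

def toy_sepsis_risk_score(age, prior_cancer, systolic_bp,
                          hospital_transfers, vasopressor_hours, pathogen):
    score = 0
    if age >= 65:
        score += 2
    if prior_cancer:
        score += 2
    if systolic_bp < 90:                 # hypotension
        score += 3
    score += min(hospital_transfers, 3)  # cap the transfer contribution
    if vasopressor_hours > 24:
        score += 2
    if pathogen in HIGH_RISK_PATHOGENS:
        score += 2
    return score
```

A real model trained on a decade of EHR data would learn such weights from outcomes rather than hard-coding them, but the sketch shows how heterogeneous clinical features can combine into a single risk estimate.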

"For clinicians, making a sepsis diagnosis can be very difficult, as the symptoms overlap with many other common illnesses," said Dr. Donna Wolk, division director of molecular and microbial diagnostics and development at Geisinger. "If we can identify patients more quickly and more accurately, we can administer the right treatments early and increase the chances of a positive outcome."

Geisinger has been a leader in its use of AI for predictive analytics. Earlier this year, its Steele Institute for Health Innovation launched a multi-year collaboration with Medial EarlySign to implement machine learning technology for detection and prevention of chronic and high-cost diseases. And in 2018 we reported on its efforts to apply machine learning to imaging data to more quickly find intracranial hemorrhage.

"Our experience using machine learning and data science has been very positive, and we see huge potential to continue its use in the medical field," said Dr. Vida Abedi, staff scientist in Geisinger's department of molecular and functional genomics. "We are well on our way to breaking new ground in clinical care for sepsis and achieving more positive outcomes for our patients."

"It's very important for me as a clinician and a research scientist to save patient lives using all the knowledge of the data and the clinical background," added Dr. Hosam Farag, a bioinformatic scientist in Geisinger's Diagnostic Medicine Institute. "Machine learning can close the care gaps and optimize the treatment. That makes me passionate about how to save patient lives."

Heather Landi

Nearly two-thirds of health plans (63%) say they are using recently proposed federal interoperability regulations as the first step toward broader strategies on interoperability, according to a new survey.

This suggests compliance with the new standards will be seen as the bare minimum in healthcare interoperability programs, according to a survey from the Deloitte Center for Health Solutions. Forty-three percent of health system chief technology officers or chief information officers also said the proposed interoperability standards will be the baseline for broader strategic interoperability initiatives.

The Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health IT (ONC) published proposed rules back in February designed to drive the industry toward widespread interoperability.

CMS' proposed rule (PDF) would require Medicaid, the Children’s Health Insurance Program, Medicare Advantage plans and Qualified Health Plans to make enrollee data immediately accessible via application programming interfaces (APIs) by Jan. 1, 2020.

ONC also unveiled its information blocking rule (PDF) that defines exceptions to data blocking and fines that may be associated with the practice. The rule was mandated by the 21st Century Cures Act.

Many healthcare groups and stakeholders have voiced concerns about the interoperability rules, urging CMS to take a phased approach to implementing the standards, saying the proposed 2020 implementation timeline is "unrealistic." Some groups, like the Health Innovation Alliance, have called for the recently proposed interoperability rules to be scrapped and rewritten.

Federal policymakers are using multiple regulatory levers to advance interoperability such as new payment models, the Trusted Exchange Framework and Common Agreement and a recent executive order on transparency.

"Taken together, these initiatives showcase the administration’s continued push to make health care information more accessible by encouraging plans and providers to share data with each other to improve the quality and efficiency of health care and with patients to help them make informed decisions," Deloitte Center for Health Solutions executives wrote in a report about the survey results.

The survey found that some healthcare organizations are just ticking off the boxes before moving on to other priorities. Nearly half of health systems (49%) and 34% of health plans say they have no plans to go beyond compliance requirements as part of the new interoperability rules. Eight percent of health systems and 3% of insurance technology executives said they have not read the proposed rules or are still determining the implications of the rules.

The survey also revealed that most healthcare organizations are going beyond the interoperability solutions provided by their vendors. More than half of health system executives (55%) and 60% of health plan executives say they are building their own API solutions, whether on their own or alongside a vendor's work.

About 40% of healthcare organizations are using vendors and have access to APIs provided by vendor applications or packages. Only about 3% of healthcare executives say their organizations currently do not use APIs.
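The APIs the CMS rule calls for are generally expected to expose data as FHIR resources. As a minimal sketch of what consuming such an API involves, the function below pulls patient names out of a response shaped like a FHIR searchset Bundle (the structure is simplified, and the sample data is invented; this is not a full FHIR client):

```python
# Minimal sketch of consuming an API response shaped like a FHIR
# searchset Bundle. Field names follow the FHIR Bundle and Patient
# resources, but the structure is simplified and the sample data
# is invented for illustration.

def patient_names(bundle):
    """Collect display names from Patient resources in a FHIR-style bundle."""
    names = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") != "Patient":
            continue
        for name in resource.get("name", []):
            given = " ".join(name.get("given", []))
            family = name.get("family", "")
            names.append(f"{given} {family}".strip())
    return names

sample_bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {"resourceType": "Patient",
                      "name": [{"given": ["Ada"], "family": "Lovelace"}]}},
        {"resource": {"resourceType": "Coverage"}},  # non-Patient entries are skipped
    ],
}
```

In practice such a bundle would arrive as JSON from an authenticated HTTPS endpoint; the parsing logic is the same once the response is decoded.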

"By 2040, we expect the system to be dramatically different than it is today," wrote the report authors. "Health will likely be driven by digital transformation enabled by radically interoperable data and open, secure platforms. Moreover, consumers will own their health data and play a central role in making decisions about their health and well-being."

Organizations that develop and implement a strategic approach to interoperability are likely to have a competitive advantage with insights, affordability and consumer engagement in the future of health. Healthcare organizations that fail to see beyond compliance deadlines will fall behind, Deloitte executives said.

The report authors recommend three key steps for healthcare organizations to consider when developing an interoperability strategy:

  • Define the interoperability vision for the organization—Organizations should leverage the regulatory requirements on interoperability as a jumping off point for their broader strategy for sharing data with industry stakeholders and with patients. Establish an initial interoperability governance structure, and develop a business case and key business/technology benefits. Also, identify high-value use cases such as enhanced care management and/or improved consumer and patient engagement.
  • Assess the current state—Evaluate the organization's current interoperability capabilities and define the desired future state, then conduct a gap analysis between current and future state. Develop an external engagement plan; for example, forge partnerships and encourage collaboration with external entities to better enable the interoperability vision.
  • Develop an execution road map—Prioritize a set of initiatives and road map to achieve compliance by the proposed Jan. 1, 2020, deadline. Identify longer-term goals beyond the Jan. 1, 2020, compliance date around data exchange, digital tool adoption and enhanced consumer engagement. Assign “high priority” to the must-do/critical capabilities.