Giving patients data about risks and benefits of a medical intervention is not always helpful and may even lead them to irrational decisions, according to an article in the Hastings Center Report. The finding calls into question whether it is essential to disclose quantitative data to patients to help them make informed decisions.

Currently, many patient advocates and others are embracing the “quantitative imperative”—the obligation to disclose risk-related data to patients to ensure informed consent and promote shared decision-making. Because patients often do not get information about all their options by talking to their health care providers, decision aids—pamphlets, videos, and computer programs—increasingly are being used to convey such data more comprehensively. There are more than 500 decision aids and more than 55 randomized controlled trials studying their impact. A recent review concluded that decision aids increase patient knowledge and the feeling of being informed while decreasing indecision and passivity.

The disclosure of quantitative data, however, can backfire, according to Peter H. Schwartz, MD, PhD, author of the article and faculty investigator at the Indiana University Center for Bioethics.

“There are important problems with it stemming from the way people understand and respond to numerical and graphical information,” says Schwartz.

In an accompanying commentary, Peter Ubel, MD, professor of marketing and public policy at Duke University, agrees with Schwartz’s analysis of the numeracy problem and argues that there are ways to present risk that overcome some of those problems.

One problem is that more than half of adults have significant difficulty understanding or applying probabilistic and mathematical concepts. National surveys suggest that at least 22% of adults have only the most basic quantitative skills, such as counting, while another 33% fare only slightly better and are able to do simple arithmetic.

But even people who have a good grasp of probability and math are prone to biases in how they interpret data on risks. They may give exaggerated importance to small risks or, conversely, exhibit “optimism bias” and exaggerate the chance that they will be in the “lucky” group, according to Schwartz. An individual’s psychology and the way the information is presented will determine which of these biases come into play. Either way, the bias can lead patients to make decisions about medical interventions that are not based on reason or facts.

Schwartz argues that clinicians should not always disclose all available quantitative data to all patients.

“While the data should always be available to patients who want it, the question is how to offer it and in what form,” he writes. “These issues suggest that much more empirical research and ethical analysis are required about the use of quantitative information in decision-making.”

Schwartz adds, “Questions about how and when to disclose quantitative information will become ever more pressing as advances in epidemiology and genetics provide increasingly precise ways to characterize the risks that patients face and the possible impacts of preventive treatments.”

In his commentary, Ubel states that his studies show that whether decision aids improve patient decisions depends on how they are constructed. For example, pictographs proved better at conveying risks than narrative or other kinds of graphic information. He argues for more research into how best to present information about risk to patients in order to aid decision-making.

Source: The Hastings Center