
Asking the Ultimate Question

Over the last 30 years, I’ve watched biomedical research rapidly embrace new technologies aimed at developing better drugs and improving patient care and outcome. This evolution extends from molecular modeling to bioinformatics, translational medicine, and now the conversion of personalized medicine into precision medicine and its enhancement with big data. Although these approaches typically develop from academic research, they have all migrated to commercial activities (and investment opportunities), while promising to improve healthcare.

In many cases, approaches have evolved from breakthrough science to commoditization and integration into standard research practice. For example, molecular modeling progressed from computational/quantum evaluation of chemical properties to visualization/graphics and molecular dynamics. Now, no drug is developed that does not use some form of this analysis. Bioinformatics evolved from protein structure–function analysis to sequence analysis of proteins, nucleic acids and genomes. Molecular biologists now routinely apply complex algorithms developed in advanced research in disparate areas.

Today, precision medicine is replacing personalized medicine. I believe this reflects a focus on selection among existing medicines, rather than the development of drugs that only work for an individual. The more limited definition focuses on genomic data while the broader view includes clinical history, lifestyle, and environment.

Big data integrates results from many different approaches along with clinical data. The term indicates “more data than can be adequately managed with available algorithms, storage and visualization technologies.” But these boundaries continually evolve, so today’s “big” data is tomorrow’s “typical” data. Extensive data mining efforts are applied to identify new correlative relationships. The associated technologies that support these progressions range from high-performance computing and “the cloud” to array technologies and next-generation sequencing.

So, we’re seeing a boom time of great advances. However, good basic research still does not routinely lead to actual clinical utility. The difficulty typically lies not in the “handoff”, but rather an inability to recognize the difference between “unmet clinical need” and “unstated, unmet clinical need.”

“Unmet clinical need” implies under-served diseases, such as Alzheimer’s disease or ALS, where clinical needs are not adequately met for lack of the right tools or approach.

Unstated, unmet clinical need describes a gap in knowledge or adequacy of existing processes and procedures in clinical practice; for example, diagnosis and disease stratification, or understanding the complexity across the patient, physician, provider, payer, pharma, regulator, family/caretaker and community interface. An example of this is the difficulty in treating heart failure patients with “preserved ejection fraction”. The diagnosis itself involves dealing with a complex syndrome and subjective evaluation of the patient. Even the determination of a specific threshold for “preserved ejection fraction” remains difficult to support based on observational data. As a result, seemingly definitive criteria for diagnosis can yield an extremely heterogeneous population for evaluation of new therapeutics and result in limited success.

Biomedical research too often focuses on known unmet clinical needs, while unstated needs receive very little attention or funding. Our emphasis on producing data/observations and correlations misses the most critical point: asking the right question in the first place. As W. Edwards Deming said, “If you do not know how to ask the right question, you discover nothing.” In The Hitchhiker's Guide to the Galaxy, the Deep Thought computer takes seven and a half million years to calculate the answer to “life, the universe and everything”, only to give the answer “42”, explaining that to calculate the question will take much longer. In our eagerness to pin down the answer to faster clinical translation, we often neglect to ask the right questions!

I believe this stems from a) the emphasis in science education on hypothesis-driven research, b) the parallel development of supportive technologies (which can be commercialized) and c) the expectation that their combination will yield solutions. While I fully support the value and contributions of new technologies, I am concerned that they limit one’s ability to “see the forest for the trees”. Interestingly, medicine has begun to evaluate “design thinking” – which starts with a goal instead of a specific problem – in the delivery of care: redesigning patient waiting and treatment areas, admissions procedures, and so on. However, it is not yet being applied to focus research on real clinical needs beyond these basic concepts. This reduces the value of basic research (engineering) and reinforces the dichotomy between “pure” and “applied” research, where the former develops novel ideas and concepts that can be used to address issues in the latter. Design thinking actually sits between the two and attempts to draw on the strengths of both.

Through my interactions with clinicians, I have observed several specific gaps that, if addressed, could greatly impact the development and translation of research into the clinic by first identifying outstanding clinical issues:

  1. Disease stratification. Most diagnoses represent syndromes or complex disorders that need resolution into clinical subtypes. The Institute of Medicine estimates that 10 percent of patients are misdiagnosed, but this figure significantly underestimates the impact of failing to use disease stratification, based on clinical presentation, to improve patient care and outcomes.
  2. Co-morbidities and polypharmacy. Virtually all patients come to a physician with a history of previous disease, current disease or additional undiagnosed disease. Patients are often taking multiple prescription medications and over-the-counter remedies that will impact diagnosis and response to treatment.
  3. Clinical trials do not enroll real-world patients. Clinical trials rarely deal with the complexities of either the disease (for example, stratification) or the patient (for example, co-morbidities).
  4. Comparative effectiveness. If the physician doesn’t prescribe the drug according to guidelines or the patient does not take the drug as prescribed, any drug can be rendered ineffective, so simple comparison of efficacy between drugs in a clinical trial is not adequate to predict effectiveness in real-world medicine.
  5. Disease is a process. In disease, biological processes may change over time, which can be monitored using clinical observations to define the “dimensions” of the disease. The direction of this vector defines the disease subtype. How far along the vector a patient is defines their “stage”, and how quickly they progress along the vector defines their velocity. In chronic diseases such as diabetes, the patient’s underlying biology is also in a state of change, and this can impact the presentation of disease. These can (and should) be addressed mathematically to enhance potential diagnosis and treatment.
  6. Biomarkers are not diagnostics. Biomarkers are measurable indicators of the status of underlying biological processes. Diagnostics are indicators of the presence of disease or stage of disease progression. These are not necessarily the same. Although diagnostics are used to indicate disease state or stage, they are not typically based on understanding the disease etiology, but instead are accessible markers for measurement.
  7. Clinical guidelines. Guidelines are typically developed using a consensus method or involving evidence-based methods; for example, randomized clinical trials. In a consensus guideline, potential variability in the confidence associated with each step is not presented in a transparent manner that would enhance clinical decision-making. In evidence-based guidelines, the use of varying inclusion/exclusion criteria and lack of comparison to real-world patients, as noted above, limits generalized use. In each instance, greater transparency could enhance the utility of guidelines in common practice.
  8. Electronic health records. Current efforts focus on achieving inter-operability while maintaining privacy. Unfortunately, little effort focuses on what data should be included in the electronic health record to make it useful. Learning from the experiences of countries where nationalized healthcare systems already have universal electronic health records could greatly benefit compliance and utilization for new efforts in this area.
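The vector view of disease in item 5 can be made concrete with a little arithmetic. The sketch below is purely illustrative – the clinical observations, their values, and the visit times are hypothetical, and real disease trajectories would involve many more dimensions, noise, and nonlinear change – but it shows how direction (subtype), distance (stage), and rate (velocity) fall out of longitudinal observations:

```python
import numpy as np

# Hypothetical longitudinal data: each row is a patient visit, each
# column one clinical observation defining a "dimension" of the disease.
visits = np.array([
    [1.0, 0.2, 5.0],   # baseline
    [1.4, 0.5, 4.2],   # 6 months
    [1.9, 0.9, 3.1],   # 12 months
])
times = np.array([0.0, 0.5, 1.0])  # years since baseline

# The disease "vector": net change across the observation space.
trajectory = visits[-1] - visits[0]

# Direction of the vector -> candidate subtype indicator.
direction = trajectory / np.linalg.norm(trajectory)

# Stage: how far along the vector the patient has traveled.
stage = np.linalg.norm(trajectory)

# Velocity: rate of progression along the vector.
velocity = stage / (times[-1] - times[0])

print("direction:", direction)
print("stage:", round(float(stage), 3))
print("velocity:", round(float(velocity), 3), "per year")
```

Nothing here depends on the specific observations chosen; the point is that once disease is framed as a process in a defined observation space, subtype, stage, and velocity become quantities one can compute and compare across patients.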

The reality is that most physicians, when faced with a patient across the desk, cannot take the time to wait for solutions to these issues, and typically may not even acknowledge them on a daily basis. But in the application of design thinking, these issues become the focal point for research and action. Big data can provide the mechanisms to identify and collect the data critical to address these problems. Translational research/medicine can focus on developing solutions or partial solutions, and precision medicine can provide the mechanism for delivering the results to the patient. But while all of these techniques will help us find answers, it is asking the right questions that will deliver real benefits to patients.


About the Author
Michael N. Liebman

Michael Liebman has more than 40 years of experience split between academia and industry, having traversed that “chasm of culture” several times, with a stop along the way to lead a US Department of Defense-sponsored research institute. His career has taken him from starting at the (sub)-molecular level to today, where he focuses on real-world medicine and patients, developing and implementing methods for quantitative and qualitative analysis. Michael applies system-based approaches to define the full complexity of the problem. His approach to modeling applies design thinking and is complementary to the more traditional methods of statistical analysis and data mining.
