The United States spends a larger share of its gross domestic product on healthcare than other wealthy countries, yet its health outcomes are worse. That is an oft-cited reality; less often noted is that almost half of the country's expenditures are tied up in administrative costs, a burden that could potentially be addressed with artificial intelligence.
AI technology is already in use in certain areas of healthcare, such as diagnostic imaging. Yet the authors of a new study published in the Journal of the American Medical Association contend it's being underutilized thus far.
Population health is one of the bigger opportunities for AI, in part because it is a tricky endeavor. Its goal is to achieve a baseline of care delivery and quality across certain groups or populations, and once those goals are met, new ones emerge. Because it is an ever-moving target, it is a good candidate for machine learning, which can spot trends and patterns that often go unnoticed by physicians, the authors said.
Through "unsupervised learning," a subcategory of machine learning that works without human-labeled training data, AI can use EHR and financial data to pinpoint patient groups that share common characteristics, an important step in treating disease across those populations. That in turn assists in preventive and predictive care, which generates better outcomes both financially and in terms of patient health.
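To make the idea concrete, the kind of grouping described above is often done with a clustering algorithm such as k-means. The following is a minimal sketch, not any system described in the study; the patient feature vectors are entirely hypothetical.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: partition feature vectors into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as the mean of its assigned points.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(dim) / len(cl) for dim in zip(*cl))
    return centers, clusters

# Hypothetical patient features: (age, annual visits, annual cost in $1,000s)
patients = [
    (25, 1, 0.5), (30, 2, 0.8), (28, 1, 0.6),        # younger, low utilization
    (67, 12, 14.0), (72, 15, 18.0), (70, 11, 12.5),  # older, high utilization
]
centers, clusters = kmeans(patients, k=2)
for center, members in zip(centers, clusters):
    print("center:", tuple(round(x, 1) for x in center), "size:", len(members))
```

On data this well separated, the algorithm recovers the two utilization groups; real population-health pipelines would use far richer features drawn from EHR and claims data, and typically a library implementation rather than this toy loop.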
AI can also speed up the practice of evidence-based medicine, in which clinical studies and other evidence are marshaled to find the best course of clinical action; that step, too, has the potential for savings, the authors said.
Developing new drugs and vaccines also stands to benefit from an injection of AI, particularly since drug development is time-consuming and costly. This is already starting to happen: IBM Watson and Pfizer, for example, have teamed up to use AI to sift through reams of data, including more than 30 million lab and data reports.
Despite the promise of AI, there are challenges, the authors noted, including costly initial implementation and potential regulatory hurdles, such as the requirement that new technology win approval from the Food and Drug Administration.