With healthcare spending steadily on the rise in the United States, healthcare payers are beginning to shift from volume-based to value-based payment models. The main focus in doing so is to improve the quality of care while reducing costs. One type of value-based approach is an Accountable Care Organization (ACO), defined as a “group of healthcare providers who agree to share responsibility for the quality, cost, and coordination of care for a defined population of patients.” There are three core features of the ACO model.
The first is that the ACO consists of a strong base of primary care provider organizations; the second is that performance measurement is designed to support ongoing improvement in patient care and population health; and the third is that payments are tied to the delivery of high-quality care at decreased cost. As a result, providers in ACOs face direct financial incentives. In order to reach ACO measurement goals, however, health systems first need to reduce the cost of illness.
One way to meet this objective is to develop and utilize a strong IT foundation that fosters communication, coordination, and collaboration across clinical team members. Patients also need to be actively engaged in the process, as they are an essential component of the healthcare system as well. Standardization of data is essential to the functioning of an ACO. In particular, the area of predictive analytics heavily relies on big data to improve patient health outcomes. Being able to evaluate a patient’s surroundings allows for better predictions to be made, thus providing better-suited interventions for certain patients.
By discovering new ways to keep patients healthy, healthcare costs will be reduced and overall patient health outcomes should improve as well. In order for this to happen though, new data collection systems and analysis tools need to be applied, which requires participation and engagement from providers, patients, and policymakers. Although the standardization process is a challenge to the healthcare industry, many initiatives are currently taking place in both the public and private sector to better integrate and standardize big data.
In 2012, the National Institutes of Health launched a program called the Big Data to Knowledge Initiative (BD2K), which focuses on supporting the necessary IT tools and approaches to facilitate the incorporation of big data into the health sector. Health professionals are hopeful that over time, there will be an increase in the implementation of new technologies needed to contribute to the performance of an ACO. Even though interoperability of health data is not yet a streamlined process, the data as a whole are more readily available today than in the past.
The overall success of an ACO depends on the ability to retrieve this information, and many healthcare organizations are beginning to invest in the analytics and warehousing technology needed to do so. However, simply combining provider and payer data to get real-time patient analysis is no straightforward task. ACOs across the United States are in very different stages of employing this data to improve patient care and reduce costs. As of now, few ACOs have developed a functional big data warehouse that pulls information from various sources, but many are beginning to take the necessary steps to make this happen.
Allina Health, a not-for-profit healthcare system based in Minnesota, is one example of an ACO that is moving forward in the big data analytics endeavor. The data analysts at Allina have produced approximately 60 dashboards, allowing providers and administrators to easily track outcomes against a variety of performance measures and target improvement efforts. Allina implemented its big data initiative in 2008 after contracting with Health Catalyst, a Utah-based storage and analytics systems developer, to design its data warehouse.
The database collects information from 42 different sources, which includes clinical data from Allina’s EHR, financial and claims data, and patient demographics. Having the ability to obtain real-time data at any desired specificity level, as well as track performance on critical measures, has greatly benefited Allina overall. Another example of an ACO that has recognized such benefits due to the use of a big data analytics platform is Meritus Health, a nonprofit hospital located outside of Baltimore, Maryland.
Last year, Meritus selected Explorys’ Enterprise Performance Management (EPM) Application Suite to support its ACO initiatives. The organization’s top priority when transitioning into an ACO was population health management, but like many other organizations, Meritus faced numerous challenges due to disparate health information systems across the ACO. Explorys was the application of choice because of its ability to compare benchmarks across the network and report on ACO quality measures.
Explorys’ EPM Application Suite helped to develop one cohesive data system in which Meritus can report on ACO performance at different levels, such as provider or office. This cohesive platform was estimated to reduce manual data abstraction to less than 30 percent. With more organizations moving towards value-based care models, there has been an increase in demand for big data solutions such as Health Catalyst and Explorys. Real-time collection of data allows a healthcare organization to make more accurate predictions and understand the costs needed to support its ACO initiatives.
Many smaller ACOs are not yet at the same level of sophistication as Allina or Meritus when it comes to big data analytics, but the hope is that as the data become more standardized over time, additional ACOs will follow suit. A complete and effective health data management system depends on multiple factors. First and foremost, executing a data governance strategy early on is critical. Administrators should determine what information is necessary and why it is needed sooner rather than later.
From there, the hospital can concentrate on the data that need to be gathered, as well as the information processes that will have to be managed in order to reach the anticipated outcomes. Establishing a centralized repository to house the data is another step that should be taken beforehand. However, a data warehouse alone is not going to be enough; the warehouse needs to be built on a relational database that is able to pull data from multiple systems and then integrate, sort, and analyze the data in a quick and easy manner.
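The warehouse design described above can be illustrated with a minimal sketch. The example below, which uses SQLite purely for demonstration, shows a relational database pulling extracts from two hypothetical source systems (EHR encounters and claims) and integrating them with a join and an aggregate; all table and column names are illustrative assumptions, not a real ACO schema.

```python
import sqlite3

# Hypothetical warehouse sketch: integrate data from two source systems
# (EHR clinical data and claims data) keyed on a shared patient ID.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    CREATE TABLE ehr_encounters (
        patient_id TEXT, encounter_date TEXT, diagnosis_code TEXT);
    CREATE TABLE claims (
        patient_id TEXT, claim_date TEXT, billed_amount REAL);
""")

# Load sample extracts from each source system.
cur.executemany("INSERT INTO ehr_encounters VALUES (?, ?, ?)",
                [("P001", "2015-03-02", "E11.9"),
                 ("P002", "2015-03-05", "I10")])
cur.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                [("P001", "2015-03-02", 240.00),
                 ("P001", "2015-04-11", 75.50)])

# Integrate: join clinical and financial data, then aggregate per patient.
cur.execute("""
    SELECT e.patient_id, e.diagnosis_code, SUM(c.billed_amount)
    FROM ehr_encounters e
    LEFT JOIN claims c ON c.patient_id = e.patient_id
    GROUP BY e.patient_id, e.diagnosis_code
    ORDER BY e.patient_id
""")
results = cur.fetchall()
```

In a production warehouse the same pattern applies at scale: each source system feeds extracts into the repository, and the relational layer handles the integration, sorting, and analysis.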
For real-time reporting to be possible, the data need to be readily available, and the ability to retrieve the information in various formats is also ideal, since reporting purposes tend to vary. When it comes to quality control in health data management, healthcare professionals need to ensure that the information entered into patients’ medical records is accurate. After all, it is the quality of the data that drives the quality of predictive reporting results. Unreliable data can undermine the hospital’s ability to leverage reporting analytics.
There are various ways to identify untrustworthy data by examining it for consistency, timeliness, accuracy, and diligence. One way is through the use of cleansing tools, such as “algorithms that eliminate duplications or flag mismatched procedural codes.” Quality improvement initiatives should also be developed to enhance the accuracy of data collection at the front end. These initiatives need to be pushed by the organization, as this will help to ensure that providers and staff are actively engaged. Risk management often poses a challenge to any healthcare organization.
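The two cleansing checks quoted above can be sketched in a few lines. This is an illustrative example only: the record fields and the valid-code set are assumptions, and a real tool would match against the full CPT/ICD code sets rather than a hard-coded list.

```python
# Hypothetical cleansing sketch: drop exact duplicate records and flag
# procedure codes that do not appear in a known-valid code set.
VALID_PROCEDURE_CODES = {"99213", "99214", "36415"}  # illustrative subset

def deduplicate(records):
    """Remove exact duplicate records while preserving first-seen order."""
    seen, cleaned = set(), []
    for rec in records:
        key = tuple(sorted(rec.items()))
        if key not in seen:
            seen.add(key)
            cleaned.append(rec)
    return cleaned

def flag_mismatched_codes(records):
    """Return records whose procedure code is not in the valid set."""
    return [r for r in records
            if r["procedure_code"] not in VALID_PROCEDURE_CODES]

records = [
    {"patient_id": "P001", "procedure_code": "99213"},
    {"patient_id": "P001", "procedure_code": "99213"},  # duplicate entry
    {"patient_id": "P002", "procedure_code": "9921X"},  # mistyped code
]

cleaned = deduplicate(records)
flagged = flag_mismatched_codes(cleaned)
```

Automated checks like these complement, rather than replace, the front-end accuracy initiatives described above.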
One of the main objectives of effective risk management methodologies should be to improve the patient experience while also reducing the financial risk that an unfavorable event (in this case, a lack of effective infection control methods) would create for the organization. Ideally, risk managers need any and all information related to adverse event reporting readily accessible in a central location. A healthcare enterprise data warehouse (EDW) can meet this demand by serving as a single source of data storage in which the data are aggregated and validated as they come in from various areas of the hospital.
With the EDW platform in place, the hospital can then supplement the platform with a data entry tool that functions on top of the EDW but stores the data in the EDW itself. By using this tool, risk managers can create simple forms and then publish them in an easy, collaborative manner. Another benefit of utilizing an EDW with a data entry tool is that everyone can view who has evaluated and signed off on an event, which not only streamlines the review process but also helps to improve accuracy.
Manual work is replaced with an automated process, so the likelihood of data entry errors is greatly reduced. As a whole, the risk management process requires risk managers and other hospital staff to be proactive in their duties. An EDW assists in making this possible by facilitating the trending and surveillance of data. With regard to infection control, better analysis of administrative and clinical data will help the hospital gain more clarity into hospital-acquired infection rates and any associated costs.
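The sign-off trail described above can be sketched with a simple append-only review log attached to each adverse event record. This is a minimal illustration under assumed names; a real EDW data entry tool would persist these entries in the warehouse rather than in memory.

```python
from datetime import datetime, timezone

# Hypothetical sketch of adverse-event sign-off tracking: each event
# carries an append-only review log, so anyone can see who has evaluated
# and signed off on it. All names and fields are illustrative.
class AdverseEvent:
    def __init__(self, event_id, description):
        self.event_id = event_id
        self.description = description
        self.reviews = []  # (reviewer, UTC timestamp), append-only

    def sign_off(self, reviewer):
        """Record a reviewer's sign-off with a timestamp."""
        self.reviews.append((reviewer, datetime.now(timezone.utc)))

    def reviewers(self):
        """List everyone who has signed off, in order."""
        return [name for name, _ in self.reviews]

event = AdverseEvent("AE-1042", "central line infection, ICU bed 4")
event.sign_off("risk_manager_jones")
event.sign_off("infection_control_lee")
```

Because the log is append-only and timestamped, the review history itself becomes auditable data, which is what allows the manual routing of paper forms to be replaced by an automated process.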
There are certain clinical infection surveillance systems that can also be implemented in which surgical procedures, pharmacy orders, and lab results can be analyzed in a real-time manner to help identify, manage, and control infection. From there, risk managers and clinicians can more easily identify any significant risk points. By taking on a more proactive risk management approach, staff members will be encouraged to positively collaborate with one another, and teams will hopefully come together to work towards improving overall patient care and safety.
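A surveillance rule of the kind described above can be sketched as a simple filter over a stream of lab results. The watch list and record fields here are illustrative assumptions; a real clinical surveillance system would draw on standardized lab codes and route alerts through the hospital's notification infrastructure.

```python
# Hypothetical real-time surveillance sketch: as lab results stream in,
# flag positive cultures for organisms on a watch list so infection
# control staff can intervene early. All names are illustrative.
WATCH_LIST = {"MRSA", "C. difficile", "VRE"}

def surveil(lab_results):
    """Yield an alert for each positive culture of a watched organism."""
    for result in lab_results:
        if result["organism"] in WATCH_LIST and result["status"] == "positive":
            yield {"patient_id": result["patient_id"],
                   "organism": result["organism"],
                   "action": "notify infection control"}

stream = [
    {"patient_id": "P010", "organism": "MRSA", "status": "positive"},
    {"patient_id": "P011", "organism": "E. coli", "status": "positive"},
    {"patient_id": "P012", "organism": "MRSA", "status": "negative"},
]
alerts = list(surveil(stream))
```

Evaluating each result as it arrives, rather than in a retrospective batch, is what lets risk managers and clinicians identify significant risk points while there is still time to act.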