Ensuring Transparency on the Quality of Third-party Data

Building a proactive Data Quality Framework

Challenge

Getting insight into the in-market sales of a pharmaceutical company

Most pharmaceutical companies don’t sell their medicines directly to patients; they sell them to Health Care Professionals such as hospitals and pharmacies. These Health Care Professionals act as intermediaries in the medical sales process. As a result, pharmaceutical companies, who produce the drugs, often lose track of their in-market sales. They lack answers to key questions like “Are we losing or gaining patients?” or “How is our newest drug performing?”.

This was also the case for one of our clients. To cope with this blind spot, our client spends millions of dollars per year on purchasing commercial data from an external data provider. By analysing this third-party data, our client is able to get insight into their in-market sales. To keep track of KPIs, our client has built several dashboards in a Business Intelligence environment. However, the business stakeholders who consumed these dashboards had little trust in them, largely because there was no visibility into the quality of the underlying data.

One of the datasets contained aggregated sales numbers, such as the total sales per speciality per month. The aggregation of the raw sales data was done by the third party, which resulted in a lack of transparency on how the data had been transformed.

To increase trust in the third-party data and the KPI dashboards, the IT department at our client had to perform data quality checks manually. This proved to be a rigid process that hindered the proactive identification of data quality issues.

Approach

Building a proactive Data Quality Framework

Our approach was to implement a robust data quality framework that would empower business stakeholders to proactively detect data quality issues in the third-party data themselves. With Collibra Data Quality & Observability as the tool of choice, the focus was on implementing an easy-to-use and intuitive platform. Our aim was to enable new users to seamlessly participate in the data quality process, fostering a culture of data cleanliness and accuracy.

Examples of the technical and functional support provided to our client for this use case were:

1. Custom data quality rules: assisted business users in writing customized Spark SQL queries in Collibra DQ to implement some of their most complex data quality rules (see the sketch after this list).
2. Resolution process: trained the business users to leverage the “Link IDs” functionality in Collibra DQ, so they could identify the breaking records in the source datasets themselves and report them to the third-party data provider to resolve the data quality issues.
3. Integration with Data Governance: integrated their main data governance platform, Collibra DIP, with their data quality tool, Collibra DQ, to automatically push the latest data quality scores from Collibra DQ to Collibra DIP.
4. Training for business users: provided custom live training to business users on how to leverage the different functionalities within Collibra DQ, enabling self-service use of the tool.
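
To make the first point concrete, the sketch below shows the kind of Spark SQL rule a business user could write. The dataset and column names (monthly_sales, specialty, sales_month, total_sales) are hypothetical, and the exact way the dataset is referenced depends on how the rule is configured in Collibra DQ; the idea is simply a query that returns the records violating a business expectation, so the tool can flag them as breaking records.

    -- Hypothetical rule on an aggregated third-party sales dataset:
    -- any row returned by this query fails the check and is reported
    -- as a breaking record; an empty result means the rule passes.
    SELECT *
    FROM   monthly_sales                 -- aggregated sales per speciality per month
    WHERE  total_sales < 0               -- negative aggregated sales are never expected
       OR  specialty IS NULL             -- every row must be attributed to a speciality
       OR  sales_month > current_date()  -- no sales should be reported for future months

Because such rules are plain Spark SQL, business users can adapt the conditions and thresholds to their own local needs without waiting on the IT department.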

Impact

More trusted insights into the in-market sales

The implementation of the proactive data quality process led to significant improvements for our client: data quality issues in the third-party data were easily detected, raised and resolved.

• Improved decision-making: Business stakeholders are now clearly aware of any data quality issues in the third-party data and know to what extent they can trust the KPIs on the dashboards, resulting in more informed decisions regarding the in-market sales.
• Shift to proactive data quality: The more proactive role of the business stakeholders in the identification of data quality issues results in fewer issues down the line, as they are identified earlier in the data flow.
• Better understanding of root causes: Business stakeholders are now able to monitor data quality issues over time, which enables them to identify trends and recurring issues in the datasets and to take preventive action.
• More flexibility: As data quality checks are now implemented by the business stakeholders themselves rather than the IT department, DQ rules are easily adapted to their own specific local needs.
• Reduced lead time for implementing DQ checks: The self-service data quality tool makes the implementation of DQ rules more accessible and more efficient for the business, resulting in lower costs for implementing new DQ rules.

This successful use case has sparked interest throughout the client's organization, as it clearly demonstrates how a proactive approach to data quality can result in tangible business value. Datashift continues to drive improvements in data quality, helping our client build more trust in both third-party data and their own data, driving growth in the pharmaceutical industry.
