The twin challenges of data integration and data quality

ContactPoint, IMPACT MOPI – a data quality strategy for joined-up services

For many public sector organisations, the twin challenges of data integration and data quality continue to be a barrier to effective information sharing and joined-up service provision. This is true whether for a police force seeking to integrate criminal records under the IMPACT MOPI initiatives, or for a local authority and its partners attempting to build a single citizen view or ensure accurate child support data for ContactPoint. These data quality responsibilities are likely to be the single biggest source of resistance to national systems delivery.

National challenge
Although public sector information sharing projects tend to be run centrally, it is the local organisations themselves that are responsible for ensuring the data they supply is accurate, complete, valid, reliable, timely, free of duplication and in keeping with required national standards and formats.
Not all local authorities, police forces and other agencies will recognise the size of their data quality challenge quickly enough to redress it. This is because their local systems may operate well with existing data, giving them false confidence. But that data, although suitable when isolated to a single, local system, may not be fit for purpose for either local or national integration projects:

  • Different systems hold data in diverse formats that conform to varying standards. They may also exhibit alternative spellings or inaccurate addresses.
  • These data quality variations make accurate record matching across systems very difficult, so the data cannot be used to create fully reliable single views.

For ContactPoint, local authority agencies such as social care, education, health trusts and youth offending need first to submit their data to the web-based Local Data Quality Toolkit (LDQT) to check that it meets ContactPoint validation rules. The data is given a red, amber or green status as to its acceptability.
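The red/amber/green grading described above can be pictured with a short sketch. The real LDQT validation rules are not reproduced here, so the required fields, the postcode pattern and the thresholds below are purely illustrative assumptions:

```python
import re

# Hypothetical validation rules -- the actual LDQT rules differ;
# these fields and checks are illustrative only.
REQUIRED_FIELDS = ["surname", "forename", "date_of_birth", "address"]
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

def rag_status(record: dict) -> str:
    """Return a red/amber/green acceptability rating for one record."""
    missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
    if missing:
        return "red"      # mandatory data absent -- record would be rejected
    if record.get("postcode") and not POSTCODE_RE.match(record["postcode"]):
        return "amber"    # data present but not in the expected format
    return "green"

record = {"surname": "Smith", "forename": "Ann",
          "date_of_birth": "2001-04-12", "address": "1 High St",
          "postcode": "RG1 1AA"}
print(rag_status(record))  # green
```

In a real submission the rules would be far richer, but the principle is the same: each record is graded, and anything short of green tells the agency where remedial work is needed.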
Under IMPACT MOPI, data within the six Bichard business areas (Intelligence, Custody, Crime, Firearms, Domestic Violence and Child Abuse) must comply with the national IMPACT data quality standards before implementation of the Police National Database in 2010.
Data that does not pass these stringent checks will be rejected, leaving the local agency to determine where the problems lie and how to rectify them.

Local strain
Ensuring data is fit for purpose is a strain for local IT departments. Many have no experience of data migration; others may have made it the responsibility of the supplier of new systems.
This lack of experience exposes local agencies to the real risk of failing to meet their data commitments within their financial and timescale targets.

An answer
One of the most important initiatives local bodies should undertake quickly is to test whether their data complies with national standards. If it fails, they must analyse the data to identify the issues, then act quickly or risk running out of time.
Data profiling and discovery, if performed manually, is a detailed and complex activity that is unlikely to uncover all the problems. Fortunately, data structure, rules and quality can be profiled using automated data quality solutions such as the Trillium Software System from Harte-Hanks. It gives data analysts and stewards an almost immediate indication of where problems lie – across multiple source systems. It allows drill-down to view the types and frequencies of issues and provides access right down to the data rows identified.
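The kind of summary a profiling tool produces can be sketched in a few lines. This is not how the Trillium Software System is implemented; it is a minimal stand-in showing the sort of completeness and frequency statistics such tools surface per column:

```python
from collections import Counter

def profile(rows: list[dict]) -> dict:
    """Summarise completeness and value frequencies per column --
    an illustrative stand-in for automated data profiling."""
    columns = {key for row in rows for key in row}
    report = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        filled = [v for v in values if v not in (None, "")]
        report[col] = {
            "null_rate": 1 - len(filled) / len(rows),   # share of missing values
            "distinct": len(set(filled)),               # distinct non-empty values
            "top_values": Counter(filled).most_common(3),
        }
    return report

rows = [{"town": "Reading"}, {"town": "READING"}, {"town": ""}]
print(profile(rows)["town"])
```

Even this toy report makes the point: "Reading" and "READING" appear as two distinct values, flagging exactly the kind of formatting variation that defeats record matching.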
Having located the issues, strategies need to be defined to improve existing data and to ensure better data capture practices in future. The Trillium Software System again automates these processes. Its data cleansing and standardisation functions interpret and reorganise data within any context, from any source – even free text disassociated from relevant fields. It then matches and unifies records into composite views connecting services, people, objects, locations and events, for example, ready to be uploaded to ContactPoint, MOPI and other initiatives.
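The standardise-then-match-then-unify pipeline described above can be illustrated with a toy example. The Trillium Software System's actual cleansing and matching logic is proprietary, so the normalisation steps and the simple surname-plus-postcode match key below are assumptions made for demonstration only:

```python
# Toy standardise / match / unify pipeline -- illustrative, not Trillium's logic.

def standardise(record: dict) -> dict:
    """Upper-case values and collapse whitespace; strip spaces from postcodes."""
    out = {k: " ".join(str(v).upper().split()) for k, v in record.items()}
    out["postcode"] = out.get("postcode", "").replace(" ", "")
    return out

def match_key(record: dict) -> tuple:
    # Crude match key for demonstration: surname plus postcode.
    return (record.get("surname"), record.get("postcode"))

def unify(records: list[dict]) -> list[dict]:
    """Merge records sharing a match key into composite views."""
    merged: dict[tuple, dict] = {}
    for rec in map(standardise, records):
        composite = merged.setdefault(match_key(rec), {})
        for field, value in rec.items():
            composite.setdefault(field, value)  # keep first non-missing value
    return list(merged.values())

sources = [
    {"surname": "smith", "postcode": "RG1 1AA", "forename": "Ann"},
    {"surname": "SMITH", "postcode": "RG11AA", "dob": "2001-04-12"},
]
print(len(unify(sources)))  # 1 -- the two source rows merge into one view
```

The two source rows differ in case and postcode spacing, yet standardisation lets them match and merge into a single composite record combining fields from both – the essence of building a single view across systems.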
Naturally, given the sensitive and personal nature of the information being handled, the automated data quality process must be auditable, allowing the rules to be examined and actual records to be viewed. While many of these rules are built in as standard, they are also adaptable allowing compliance with highly specific requirements and changes in regulations.
Once existing local data has been brought up to nationally acceptable standards, it is important to ensure that all new data complies as well. Real-time data quality governance can be implemented using the Trillium Software System to check new data on entry, cleanse it and match it with existing records. Compliance can then be monitored over time; a dashboard allows agencies to ensure their data supports the delivery of cross-service, cross-force intelligence in support of increasingly effective public services.
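Monitoring compliance over time, as a dashboard would, amounts to tracking the share of incoming records that pass validation in each batch. The pass/fail rule below is a placeholder assumption, not any initiative's actual criteria:

```python
# Sketch of compliance monitoring over time; the pass rule is a placeholder.

def passes(record: dict) -> bool:
    """Placeholder check: record must carry a surname and a date of birth."""
    return bool(record.get("surname")) and bool(record.get("date_of_birth"))

def daily_pass_rates(batches: dict[str, list[dict]]) -> dict[str, float]:
    """Fraction of records passing validation, per daily batch."""
    return {day: sum(passes(r) for r in rows) / len(rows)
            for day, rows in batches.items()}

batches = {
    "2009-06-01": [{"surname": "Smith", "date_of_birth": "2001-04-12"},
                   {"surname": "Jones"}],
    "2009-06-02": [{"surname": "Khan", "date_of_birth": "1999-01-30"}],
}
print(daily_pass_rates(batches))  # {'2009-06-01': 0.5, '2009-06-02': 1.0}
```

A rising pass rate over successive batches is the simplest evidence an agency can offer that its data capture practices are improving.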
For a demonstration of the Trillium Software System, or to speak with existing public sector customers, please contact:

For more information
Marina Odendaal
Harte-Hanks Trillium Software
Tel: 0118 940 7669