Maintaining data integrity is essential for all organisations, and poor data integrity can have many negative consequences. Businesses may become less competitive, suffer reputational damage or fail to meet regulatory standards, and many organisations may find it harder to make evidence-based decisions, among numerous other implications. At worst, poor data management can lead to serious consequences for some organisations, such as misidentifying patients and issuing incorrect treatments.
For those, like PensionBee, who operate within the pensions industry, failure to maintain data integrity could impact numerous areas of service. The Pensions Regulator (TPR), which regulates the UK’s workplace pension schemes, emphasises that without accurate record-keeping there could be a variety of consequences, ranging from a pension scheme failing to meet its legal requirements to harming its ability to check that employers are paying the correct contributions on behalf of their employees. In 2019, TPR found that hundreds of pension schemes had failed to review their data within a three-year period, let alone perform an annual review as it expects. Among other issues, this may mean workplace pension savers don’t receive the pensions they’re entitled to.
With the potential cost of poor data integrity so high, it’s important that organisations are able to maintain a set of data they can readily rely upon. To this end, creating a ‘golden record’ of data is sought after for its ability not only to help organisations meet compliance rules but also to create more efficient processes and unlock new opportunities. A golden record of data refers to a consolidated data set which is considered to be the single source of truth for all the data a business holds about a customer, employee or product.
Building a golden record involves collating data stored across potentially numerous systems, such as customer relationship management (CRM) and enterprise resource planning (ERP) databases, and harmonising them into a single data set. The idea is that the golden record data can be safely assumed to be correct and the most reliable data available.
No matter whether an organisation functions as part of a government, within a healthcare or education setting, or as a non-profit or commercial organisation, a golden record of data can positively impact how efficiently it runs day-to-day and create new opportunities.
Why create a golden record for data?
It’s easy to imagine that, as the amount of data stored across multiple systems increases, so too does the probability of errors and mismatches among those data records. This makes such data difficult and time-consuming for a business to use, as it becomes hard to identify which details for an entity, such as a customer or product, are correct.
Without a golden record, data are often duplicated and sometimes incomplete across databases. Anyone who’s ever searched through databases of records has probably come across these issues; for instance, encountering several versions of someone’s name, where one may be misspelt, or finding multiple email addresses recorded for the same individual.
Take, as another example, the name of a business customer. This may be recorded in an invoicing system as Arkwright Enterprises, whilst also being recorded in an ERP system as A. Enterprises. Or perhaps the postal address entered by a customer when registering on an e-commerce site may include a line not recorded in the CRM database. If all of these systems were merged into one master data set without cleaning the data beforehand, each of these records would be included as a separate entry, along with any errors in the data itself.
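As a minimal sketch of the problem (the records and field names here are purely illustrative), naively combining two systems simply duplicates the customer:

```python
# Hypothetical records for the same business customer held in two systems.
invoicing = [{"name": "Arkwright Enterprises", "postcode": "LS1 4AB"}]
erp = [{"name": "A. Enterprises", "postcode": "LS1 4AB"}]

# A naive merge just concatenates the sources, so one real-world
# customer appears twice under two different names.
naive_master = invoicing + erp
print(len(naive_master))  # two entries for a single customer
```

Without cleansing and matching beforehand, every variation and error in the source systems carries straight over into the merged data set.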
The benefits of creating a golden record
A golden record of data can open up many benefits, of which some of the most significant are listed below.
Compliance and regulation
It’s much easier to fulfil compliance-related requests such as a Subject Access Request (SAR). SARs are provided for under the General Data Protection Regulation (GDPR) and Data Protection Act (DPA) legislation, introduced in 2018, enabling consumers to request the data a business holds about them and details of how it’s being used. Should a business receive one of these requests, it’ll find itself in a much better position to provide all the data necessary, as well as information on where and how it’s stored.
Improved decision-making
More incisive decision-making is enabled by greater insights into a data set. For example, a business may notice certain trends in consumer behaviour which may help it develop a new product or marketing strategy.
Enhanced marketing activity
Marketers are provided with a more holistic and meaningful picture of their customers. This could give them the ability to create more highly targeted campaigns and better-personalised ads, or open up opportunities to cross-sell other items.
Operational costs and complexity
A golden record of data can provide several benefits to the way a company operates. Operational costs such as data storage are reduced when there are fewer data entries which need to be maintained.
Data can be more quickly and easily searched and analysed because it’s immediately available in the highest quality possible and because there’s only one central record to look at instead of searching multiple systems.
As the fields in the master data set are linked to the corresponding ones in a data source such as a CRM system, when those fields in the original data source are updated so too is the master record, thereby helping to ensure your master data set always has the most recent data.
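One simplified way to picture this linkage (the source systems and field names here are hypothetical) is a golden record whose fields each point back to the source system that owns them:

```python
# Hypothetical source systems, each owning some customer data.
sources = {
    "crm": {"email": "j.smith@example.com"},
    "billing": {"postcode": "LS1 4AB"},
}

# Each golden-record field is mapped to its authoritative source system.
field_map = {"email": "crm", "postcode": "billing"}

def golden_record():
    # Re-read each field from its owning source on every lookup,
    # so the golden record always reflects the latest source data.
    return {field: sources[system][field] for field, system in field_map.items()}

sources["crm"]["email"] = "jane.smith@example.com"  # the source is updated...
print(golden_record()["email"])  # ...and the golden record follows
```

In practice the linkage is usually handled by a master data management tool or a synchronisation pipeline rather than recomputed on demand, but the principle is the same: updates flow from the owning source to the master record.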
The challenges of creating a golden record of data
The benefits of a golden record can transform how successful an organisation can be, yet reaping those benefits depends upon a well-thought-out and well-implemented plan for creating a golden record of its data. Compiling a record of data free from inaccuracies and duplication isn’t necessarily a quick or straightforward process, and often requires several carefully applied steps.
Identifying and standardising data
Prior to being merged into a golden record, all the sources of data need to be identified. This can be more complex in organisations with a large and growing number of systems, each recording data separately.
The fields within the data source should be reviewed to ensure they’re as complete and accurate as possible. For example, there may be a value for an item in a data source which has been entered into the wrong field, such as a customer’s full name being entered entirely in the first name field and not split across both first and last name fields. The fields in the data source should also follow the formatting requirements of the golden record. For instance, the format for a date field in a data source should match that used by the golden record.
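A short sketch of what this kind of standardisation might look like, assuming a hypothetical record with a mis-filled name field and a UK-style date (the field names and formats are illustrative):

```python
from datetime import datetime

def standardise(record):
    # Split a full name that was entered entirely in the first-name field.
    if " " in record.get("first_name", "") and not record.get("last_name"):
        first, _, last = record["first_name"].partition(" ")
        record["first_name"], record["last_name"] = first, last
    # Normalise a UK-style date (DD/MM/YYYY) to the ISO format the
    # golden record is assumed to use (YYYY-MM-DD).
    if "/" in record.get("joined", ""):
        record["joined"] = datetime.strptime(
            record["joined"], "%d/%m/%Y"
        ).strftime("%Y-%m-%d")
    return record

print(standardise({"first_name": "Jane Smith", "last_name": "", "joined": "03/11/2019"}))
# → {'first_name': 'Jane', 'last_name': 'Smith', 'joined': '2019-11-03'}
```

Applying the same rules to every source before merging means the golden record receives data in one consistent shape.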
Matching and merging data
Next, matches in the data need to be identified to remove any duplicates. In a customer database, for instance, there could be numerous customers with similar names. Are these the same individual? If so, are the variations in the name simply misspellings? Or was an error made editing the wrong entry at some point, meaning the different data entries have become mixed up with each other?
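As a rough illustration, a simple string-similarity check can flag likely duplicates for review; real matching pipelines typically use more sophisticated techniques, and the names and threshold here are purely illustrative:

```python
from difflib import SequenceMatcher

def likely_duplicate(a, b, threshold=0.85):
    # Compare two names case-insensitively and flag them as a likely
    # duplicate when the similarity ratio meets the chosen threshold.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

print(likely_duplicate("Jane Smith", "Jane Smyth"))  # probable misspelling → True
print(likely_duplicate("Jane Smith", "John Baker"))  # clearly different → False
```

Pairs flagged this way still need a decision, automated or manual, on whether they really refer to the same entity.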
Where any duplicated data has been identified, a decision has to be made as to which field, from which data source, should supersede all others and become the authoritative version of that item of data to be merged into the golden record. Going back to our earlier example, when deciding between two names recorded separately, should a business go with A. Enterprises from data source one or Arkwright Enterprises from data source two? In this case, a business may choose the data from data source two as this source is generally more reliable in recording business names. A scheme which records employer contribution data, such as contribution amounts, payment dates and recipient details, in multiple places will need to decide which instances of the data are most accurate and resolve any inconsistencies found.
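One common way to encode such decisions, sketched here with hypothetical sources and a per-field priority rule, is to take each field’s value from the source ranked most reliable for that field:

```python
# Hypothetical records for one customer across two source systems.
records = {
    "invoicing": {"name": "Arkwright Enterprises", "postcode": "LS1 4AB"},
    "erp": {"name": "A. Enterprises", "postcode": None},
}

# Per-field survivorship rule: for each field, list the sources in
# order of reliability (an illustrative choice, not a fixed standard).
source_priority = {
    "name": ["invoicing", "erp"],
    "postcode": ["invoicing", "erp"],
}

def merge(records, source_priority):
    golden = {}
    for field, ranked_sources in source_priority.items():
        for source in ranked_sources:
            value = records[source].get(field)
            if value:  # take the first non-empty value in priority order
                golden[field] = value
                break
    return golden

print(merge(records, source_priority))
# → {'name': 'Arkwright Enterprises', 'postcode': 'LS1 4AB'}
```

Rules like this resolve the routine cases automatically, leaving only genuinely ambiguous matches for manual review.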
However, there could well be more complex scenarios, particularly where it’s not clear that the data from multiple sources are referencing the same entity. In circumstances like this, a manual review may be needed to decide which version of a record should be chosen for merging. Whilst manual reviews may slow down the process of creating a golden record, they may be the only way to judge which data source to use, and the short-term trade-off in time should reap benefits over the long term.
Unlocking the power of the golden record
Having a golden record of data holds tremendous value for organisations across industries, from meeting compliance obligations at a minimum to helping to propel commercial business ambitions, yet the advantages it offers may be limited without a rigorous process for creating one in the first place. There should be well-designed data source selection and cleansing criteria, including manual data cleansing where it’s needed, as well as a process for how future updates to a golden record are made. Establishing a method which includes controlling and approving future changes will help protect its integrity and produce the kind of quality data that can help make transformative business decisions.