By leveraging tokenization-as-a-service, the threat posed by internal and malicious actors can be all but removed.

With organisations now moving more aggressively into the digital sphere, data is becoming one of the most highly prized assets in the business equation. In its 2019 Pulse Survey, PricewaterhouseCoopers (PwC) found that consumer data is the most valuable for companies to harvest – and stated that by increasing data usability by just 10%, the average annual revenue boost for a Fortune 1000 company would be upwards of $2 billion.

No wonder, then, that data breaches have become one of the top risks for business leaders, with the 2020 Allianz Risk Barometer citing ‘cyber incidents’ as the most important risk for leaders today. Indeed, the financial and reputational costs of a data breach can be ruinously high – 2019 research from IBM Security put the average cost of a data breach in South Africa at $3.1 million.

Importantly, a significant share of breaches originates inside the organisation, with Verizon finding that 34% of data breaches can be traced to internal actors. So it’s not only the breakdown of sophisticated IT security systems that leaders have to worry about – it’s also their own people.

Additionally, rigorous data privacy laws such as South Africa’s Protection of Personal Information Act (POPIA) and the EU’s General Data Protection Regulation (GDPR) carry stiff penalties for companies that fall foul of the legislation by failing to manage their customers’ data correctly. British Airways, for example, is currently facing a fine of £183.39 million for GDPR infringements.

Hacker-proof data protection

But what if there was a way to remove the sensitive information from the data itself, without compromising its value (data usability)? If there’s no sensitive data to begin with, you can’t have a breach!

This is the promise of tokenization: the process of protecting sensitive data by replacing it with a randomly generated substitute value called a token, while the original value is retained and stored securely. Within the payments industry, tokenization is increasingly used to protect sensitive information and prevent credit card fraud. For example, the system that TrustCommerce developed replaced the primary account number (PAN) with a randomized number, or token. When a merchant needed to process a payment, they could reference the token and TrustCommerce would process the payment on their behalf.
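The payment flow above can be sketched in a few lines. This is an illustrative sketch only – the function names and the in-memory vault are assumptions for the example, not TrustCommerce's actual implementation:

```python
import secrets

# Provider-side vault mapping tokens back to PANs.
# In production this would be a hardened, access-controlled data store.
_vault = {}

def tokenize(pan: str) -> str:
    """Replace a primary account number with a random token."""
    token = secrets.token_hex(8)  # random: no mathematical link to the PAN
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Only the provider, holding the vault, can recover the PAN."""
    return _vault[token]

# The merchant stores and references only the token.
token = tokenize("4111111111111111")
assert token != "4111111111111111"
assert detokenize(token) == "4111111111111111"
```

Because the token is drawn at random rather than derived from the card number, stealing the merchant's records yields nothing that can be worked back to a real PAN.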

Yet this technology, which is both highly secure and cost effective to implement, can be applied to almost any scenario where sensitive information (digital assets) is being stored. For large companies and organisations that sit with massive volumes of consumer data, for example, tokenization can effectively remove the risk of data breaches – by removing the assets (or value) that criminals are targeting. Unlike encryption, tokenization involves no mathematical key: a token cannot be reversed back to the original value, which can only be retrieved via a lookup in the secure vault. This makes it doubly secure – and analysts argue that it is also easier to establish and maintain than encryption.

An emerging model: Tokenization-as-a-Service

It is perhaps unsurprising, then, that the concept of tokenization-as-a-service – whereby this process is externally and professionally managed – is fast gaining traction amongst business leaders across sectors.

Here’s how it works: in a normal data warehouse implementation, sensitive personal information (that is valuable for analysis) is distributed amongst the data – and hundreds of individuals within the organisation can potentially gain access. This constitutes one of the biggest concerns for leaders today. To mitigate this risk, tokenization-as-a-service replaces the sensitive personal information with tokens, and stores the token and personal information data pairs in a secure vault. This secure vault can only be accessed via an API, or by a very limited number of individuals. Importantly, having tokenization delivered as a service means that individuals in the organisation can’t replicate the algorithm, as it’s not known to anyone sitting within the company. This radically limits the potential for internal bad actors.
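The vault model described above can be sketched as follows. All names here are hypothetical, and the API key stands in for whatever credentialed access a real provider would enforce – a minimal sketch, not any vendor's actual service:

```python
import secrets

class TokenVault:
    """Hypothetical provider-side vault: holds token/PII pairs,
    and gates re-identification behind an API credential."""

    def __init__(self, api_key: str):
        self._api_key = api_key
        self._pairs = {}  # token -> personal information

    def tokenize(self, pii: str) -> str:
        """Issue a random token for a piece of personal information."""
        token = secrets.token_hex(8)
        self._pairs[token] = pii
        return token

    def lookup(self, token: str, api_key: str) -> str:
        """Recover the original value; only credentialed callers succeed."""
        if api_key != self._api_key:
            raise PermissionError("vault access denied")
        return self._pairs[token]

vault = TokenVault(api_key="s3cret")
# The warehouse row carries only the token, never the raw PII,
# so analysts can join and count on it without seeing identities.
record = {"customer": vault.tokenize("jane@example.com"), "spend": 120}
```

The key property is that the warehouse and its many users hold only tokens; the pairing back to personal information lives solely with the external provider.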

Thus, for business leaders and decision-makers with the hefty responsibility of protecting sensitive consumer data, partnering with a provider that includes tokenization-as-a-service as part of its Privacy-by-Design offering addresses two key risks – data breaches and regulatory non-compliance – almost entirely.

In this way, business leaders do not relinquish control of their data, and the value is still retained – while simultaneously closing the door on sophisticated data breaches. By protecting what is arguably among their most valuable assets through tokenization-as-a-service, organisations can focus on their core activity: serving their customers, and harnessing data to innovate from within. 


At Omnisient, we work with businesses and decision-makers to enable compliant and privacy-preserving data collaborations, data sourcing and data monetization. Contact us for a demo.
