October 16, 2018

How to share sensitive data efficiently? Tokenize it!

Mikko Peltonen

Lead Solution Architect, Security Services, Tieto

Tokenization is gaining traction as a method to protect sensitive data and to lower cyber security risks. Interest is rising, driven in part by GDPR requirements. What are the benefits of tokenization, and how does it compare to encryption?

In the payment industry, tokenization is a common and well-known way to protect transactions and the payment data of point-of-sale systems and applications. Recently, the method has started to spread to other industries and new use cases, especially in the cloud.

Curiously, tokenization is a very old concept. It is now being applied to new problems, either replacing or complementing other means of protecting data.

Tokenization is a way to obfuscate sensitive data such as a social security number by converting it into a random string of characters. That string is called a token.

Tokens are stored in a database called a token vault, which records the relationship between the original piece of data and the token. Whenever someone tries to access tokenized data, the token handler verifies whether they have the right to read the detokenized value.
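To make the idea concrete, below is a minimal Python sketch of a token vault. It is illustrative only: the TokenVault class, the in-memory dictionaries and the boolean access check are assumptions made for this example, and a real vault would be a hardened, access-controlled database with proper authorization.

    import secrets

    class TokenVault:
        """Illustrative token vault: stores the token <-> value relationship."""

        def __init__(self):
            self._token_to_value = {}
            self._value_to_token = {}

        def tokenize(self, value: str) -> str:
            # Return the existing token so one value always maps to one token.
            if value in self._value_to_token:
                return self._value_to_token[value]
            token = secrets.token_urlsafe(16)  # random string, unrelated to the value
            self._token_to_value[token] = value
            self._value_to_token[value] = token
            return token

        def detokenize(self, token: str, authorized: bool) -> str:
            # The token handler verifies access rights before revealing the value.
            if not authorized:
                raise PermissionError("no right to read detokenized data")
            return self._token_to_value[token]

    vault = TokenVault()
    token = vault.tokenize("010150-123A")  # a made-up social security number
    print(token)                           # random characters, reveals nothing
    print(vault.detokenize(token, authorized=True))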

Tokenization ≠ encryption

Some might ask if tokenization is just a version of encryption. It isn’t.

Encryption obfuscates data with a mathematical algorithm and a key. In symmetric-key encryption, the same key that enciphered the data can also turn the ciphertext back into its plain-text value.
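For contrast, here is a short symmetric-encryption sketch. It assumes the third-party cryptography package and its Fernet recipe, and the sample value is made up. Note how the same key turns the ciphertext back into plain text, which is exactly what a token cannot do.

    # Assumes: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # whoever holds this key can decrypt
    f = Fernet(key)

    ciphertext = f.encrypt(b"010150-123A")
    print(f.decrypt(ciphertext))     # b'010150-123A', reversible with the key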

Benefits of tokenization:

  • No way to reverse engineer a token back to the original data.
  • Stolen tokens pose no risk to data security.
  • Low demand for computing power.
  • Suitable even for IoT security.

Tokenization involves no keys, only random characters. If an intruder steals tokens, they are useless because there is no way to reverse engineer the real data behind them. This limits the damage of a data breach.

Additionally, encryption is a computing-intensive process, whereas tokenization requires very little processing power and is very fast. That makes tokenization useful even in IoT security, where typical devices have weak processors and tight power budgets.

Analyze data assets with little risk

For organizations, tokenization offers a number of benefits, especially for protecting personal data in the cloud.

It may be used either as an alternative to encryption or in combination with it, depending on the requirements of the use case or regulation. In many cases, neither encryption nor tokenization alone is sufficient or viable.

Four use cases drive tokenization at the moment:

  • GDPR: tokenization helps ensure compliance and minimize data breach risks.
  • Cloud: tokenization helps protect identities in cloud services.
  • Analytics: tokenization provides a method to pseudonymize big data assets containing personal data, enabling their processing.
  • Application development outsourcing: tokenized data is easier to hand over to outsourced developers.

GDPR is clearly the main factor that has brought tokenization into the discussion over the last couple of years. Tokenization provides a relatively simple method to obfuscate personal data and thus fulfill regulatory requirements.

Cloud usage may be the single biggest driver of tokenization. Many organizations hold data about their own staff and/or customers that is partially sensitive and identifies individuals, yet they also need to use that data in cloud-based applications and transactions. A secure way to do so is to convert the sensitive fields into tokens before transmitting the data to the cloud. Tokenization can be performed by in-house applications or by tokenization-as-a-service platforms offered by some cloud security vendors.
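As a sketch of that workflow, the snippet below reuses the illustrative TokenVault from earlier to swap out sensitive fields before a record leaves the premises. The record layout and field names are assumptions made for this example.

    # Tokenize sensitive fields in-house before sending the record to the cloud.
    record = {"name": "Mikko Example", "ssn": "010150-123A", "department": "R&D"}
    SENSITIVE_FIELDS = {"name", "ssn"}

    outbound = {
        field: vault.tokenize(value) if field in SENSITIVE_FIELDS else value
        for field, value in record.items()
    }
    # 'outbound' now carries tokens instead of the sensitive values;
    # the originals never leave the on-premises vault.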

A third big driver is related to both GDPR and the cloud. Organizations know that their massive data assets of personal and enterprise data could provide extremely valuable insight for reaching business targets, but analyzing such data requires careful risk management. If the sensitive fields are converted into tokens, the data is pseudonymized and can be analyzed en masse with far less risk.
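Because the sketch above hands out one stable token per value, which is a common design choice for analytics, equality-based operations such as joins, group-bys and distinct counts still work on the pseudonymized data. A small illustration, again reusing the made-up vault and made-up sample values:

    from collections import Counter

    events = [
        {"user": vault.tokenize("010150-123A"), "action": "login"},
        {"user": vault.tokenize("010150-123A"), "action": "purchase"},
        {"user": vault.tokenize("020251-456B"), "action": "login"},
    ]

    # Count logins per person without ever seeing who the person is.
    logins_per_user = Counter(e["user"] for e in events if e["action"] == "login")
    print(logins_per_user)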

Finally, application development often requires data assets, even sensitive ones. Trust becomes a headache when development is outsourced, as it usually is. Once the sensitive data is tokenized, it can be handed to external developers with far less exposure.

It’s easy to understand why the use of tokenization is increasing. The method has drawbacks of its own, chiefly that the token vault itself becomes a high-value target that must be protected, but organizations can benefit from a hybrid strategy of tokenization and encryption to simplify cyber security and to gain more value from their data assets.

Are you interested in learning more about protecting your assets with tokenization? Contact Tieto Security sales experts, or me.
