#Tokenization In The Service Of Protecting Sensitive Data
#Data that needs protection
The business systems of companies and organizations often contain sensitive data: personal and business information that, if disclosed or compromised, could have adverse consequences for the individual or company it relates to.
Common examples of sensitive data include a person's genetic or biometric data, credit card information, health records, information about sexual orientation and sex life, financial information, trade-union membership, and more.
#What is tokenization?
Tokenization is the process of replacing the sensitive parts of an original data set with non-sensitive substitutes and storing the sensitive parts in a secure location on a trusted medium, out of reach of cyberattacks.
The non-sensitive replacement is called a token: a string of alphanumeric characters that stands in for the sensitive value without exposing it.
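The exchange described above can be sketched in a few lines. This is a minimal in-memory illustration, not a production design: a real token vault would be a separate, hardened service, and the function and variable names here are illustrative.

```python
import secrets

# Illustrative in-memory vault mapping each token to the original value.
# In practice this mapping lives in a secured, access-controlled store.
_vault: dict[str, str] = {}

def tokenize(sensitive_value: str) -> str:
    """Replace a sensitive value with a random, meaningless token."""
    token = secrets.token_hex(16)  # 32 hex chars, unrelated to the input
    _vault[token] = sensitive_value
    return token

def detokenize(token: str) -> str:
    """Exchange a token back for the original value via vault lookup."""
    return _vault[token]

card_number = "4111 1111 1111 1111"
token = tokenize(card_number)
assert token != card_number            # the token reveals nothing by itself
assert detokenize(token) == card_number
```

Note that the token is generated independently of the input, so inspecting it yields no information about the value it replaces.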
#Tokenization against data breach
A tokenized system is considered a safe harbor for sensitive data. Even if such a system suffers a data breach, there is no valuable information to intercept or steal: the sensitive data has already been tokenized and moved to a more secure destination.
The tokens themselves have essentially no value and cannot be misused to cause any harm.
#Tokenization vs encryption
Although they are different approaches to data protection, tokenization and encryption are not mutually exclusive; on the contrary, they work in tandem and complement each other.
Sensitive data remains sensitive no matter how secure the remote destination is, so it is highly recommended that such data be encrypted before storage for an extra layer of protection. Decryption can then be performed only by the cloud-based application that manages the entire tokenization and encryption/decryption process, running on a centralized server and holding the appropriate crypto key pair.
Unlike ciphertext, a token is not derived from the data it replaces and is irreversible: there is no key that can turn the token back into the value it corresponds to. Detokenization exists only in the sense of exchanging the token for the original data via a lookup, not in the sense of extracting sensitive data from the token itself.
In other words, encryption requires a matching decryption process to close the loop on data protection, whereas tokenization explicitly excludes any such reversal from its scope.
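The contrast can be made concrete. In the sketch below, a one-time-pad XOR stands in as a deliberately simple toy cipher (real systems would use a vetted library such as an AES implementation): the encrypted value is recoverable by anyone holding the key, while the token is independent randomness that no key can ever reverse.

```python
import secrets

message = b"4111 1111 1111 1111"

# Encryption is reversible: holding the key closes the loop.
# (XOR with a random one-time pad is used here only as a toy cipher.)
key = secrets.token_bytes(len(message))
ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message

# A token is not derived from the data at all, so there is nothing
# to decrypt: the only way back is a lookup in the vault.
token = secrets.token_hex(16)
vault = {token: message}
assert vault[token] == message
```

The asymmetry is the point: leak the key and the ciphertext falls; leak the token and the attacker still has nothing, because the mapping lives elsewhere.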
#Secure location for storing sensitive data
Sensitive data can be stored online or offline, permanently or temporarily, depending on the chosen data storage strategy and how often the data may need to be accessed in the future. A secure destination can be a remote database server, removable digital media, or tapes, which have become indispensable for protecting sensitive files and information.
Data storage maintenance may or may not be outsourced, but for security reasons it is advised that sensitive data and tokens be kept away from each other and never share the same operating environment.