Tokenization

Tokenization is the process of substituting a sensitive data field with a non-sensitive equivalent (a token) that has no value to an attacker.

Tokenization systems use random number generation for numeric field types such as account and credit card numbers, ensuring there is no mathematical relationship between the token and the original data. The goal is to minimize the exposure of live data to applications and processes that do not require it. To retrieve the original value (detokenize), the tokenization system maps each token to its original value in a database secured with encryption and other security controls. Tokenization is often used to protect payment card data because of the scope-reduction benefits for PCI DSS compliance.
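
The mapping described above can be illustrated with a short sketch. The names here (TokenVault, tokenize, detokenize) and the in-memory dictionary are assumptions for illustration only; a real tokenization system keeps the mapping in an encrypted, access-controlled vault and enforces authorization before any detokenization.

```python
import secrets


class TokenVault:
    """Minimal tokenization sketch: an in-memory dict stands in for the
    encrypted mapping database described above."""

    def __init__(self):
        self._token_to_value = {}  # would be an encrypted, access-controlled store

    def tokenize(self, pan: str) -> str:
        # Generate a random digit string of the same length as the original,
        # so the token has no mathematical relationship to the card number.
        while True:
            token = "".join(str(secrets.randbelow(10)) for _ in range(len(pan)))
            if token != pan and token not in self._token_to_value:
                self._token_to_value[token] = pan
                return token

    def detokenize(self, token: str) -> str:
        # Only systems authorized to handle live data should call this lookup.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. "8302945716204873" -- safe for downstream systems
print(vault.detokenize(token))  # "4111111111111111"
```

Because the token is drawn at random rather than derived from the card number, an attacker who obtains only the token and downstream data gains nothing without access to the vault itself.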
