What Is Tokenization in the Cloud?
Tokenization is the process of substituting a sensitive data element with a random, non-sensitive equivalent, referred to as a token. The token has no extrinsic or exploitable meaning or value. Applications and processes can operate on tokens the same way they would on the original data. Unlike encryption, tokenization typically establishes no mathematical relationship between the original data and the token.
For example, the 16-digit credit card number 4362 4890 2300 8650 might be replaced by a token such as 4362 04F5 3A0D 8650
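The example above keeps the first four and last four digits and replaces the middle with random characters, a common convention so that receipts and support tools still show a recognizable fragment. A minimal sketch of that idea in Python (illustrative only, not a production tokenization scheme):

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Replace the middle of a 16-digit PAN with random hex,
    keeping the first and last four digits visible."""
    digits = pan.replace(" ", "")
    middle = secrets.token_hex(4).upper()  # 8 random hex characters
    raw = digits[:4] + middle + digits[-4:]
    # Re-group into blocks of four for display
    return " ".join(raw[i:i + 4] for i in range(0, len(raw), 4))
```

Calling `format_preserving_token("4362 4890 2300 8650")` yields a value like `4362 04F5 3A0D 8650`, where the middle eight characters are freshly random on every call.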
Tokenization is often used in high-assurance environments where it is critical to limit the exposure of the original data to applications, data stores, users, and processes, thereby reducing the risk of a data compromise. Another common driver is data residency, where regulations prevent data from leaving geographic boundaries.
A common approach in tokenization systems is to use a database that maps tokens to the original data and back. The security, reliability, and availability of this database are therefore of the utmost importance. In a distributed environment, there is the additional requirement of keeping the token databases consistent across data centers to preserve data integrity.
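The mapping approach can be sketched as a token vault: a two-way lookup between tokens and original values. The Python below is a hedged, in-memory illustration; a real deployment would back the mappings with a hardened, replicated database rather than dictionaries.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault mapping values to random
    tokens and back. Illustrative only: real vaults persist the
    mapping in a secured, highly available data store."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        # Reuse an existing token so the mapping stays consistent.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random, no relation to value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]
```

Because the token is drawn at random, the only way to get from token back to value is through the vault's lookup table, which is why its protection and consistency matter so much.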
Take a Deeper Dive
- Tokenization may be used to safeguard sensitive data involving, for example, bank accounts, financial statements, medical records, criminal records, driver’s licenses, loan applications, stock trades, voter registrations, and other types of personally identifiable information (PII). Tokenization is often used in credit card processing. The PCI Council defines tokenization as “a process by which the primary account number (PAN) is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value”. The choice of tokenization as an alternative to other techniques, such as encryption, will depend on varying regulatory requirements, their interpretation, and their acceptance by the respective auditing or assessment entities, in addition to any technical, architectural, or operational constraints that tokenization imposes in practical use.
- The security and risk-reduction benefits of tokenization require that the tokenization system be logically isolated and segmented from the data processing systems and applications that previously processed or stored the sensitive data replaced by tokens. Only the tokenization system can tokenize data to create tokens, or detokenize tokens to redeem the sensitive data, under strict security controls. The token generation method must be proven to have the property that there is no feasible means, whether through direct attack, side-channel analysis, exposure of the token mapping table, or brute force, to reverse tokens back to live data.
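One implication of the requirements above is that tokens should come from a cryptographically secure source of randomness rather than from any function of the original value, since a derived token could in principle be inverted. A brief Python illustration (assumptions: 8-byte tokens, the standard library `secrets` module as the CSPRNG):

```python
import secrets

def new_token(num_bytes: int = 8) -> str:
    """Draw a token from the OS CSPRNG. Because the token is pure
    randomness, there is no function an attacker can invert to
    recover the original value; reversal requires access to the
    token mapping table itself."""
    return secrets.token_hex(num_bytes)
```

Note that two successive calls produce independent, unrelated tokens, which is exactly the property that makes brute-force reversal infeasible without the mapping table.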
Tokenization, as a data protection technology, has been gaining significance in the wake of Snowden’s revelations of government surveillance and several countries’ moves to adopt data residency regulations. Read more about how the industry is considering tokenization technologies.
- Merchant community calls for an open and universal tokenization standard to protect payment card information
- Federal Reserve’s Mobile Payments Industry Working Group issues a statement endorsing tokenization as a solution to a number of problems with mobile and electronic payment adoption
- Banks push for tokenization standard to secure credit card payments
- The Accredited Standards Committee X9 is working on a set of standards to govern tokenization
Articles on Cloud Data Tokenization
Read the latest updates, insights, tips, and emerging trends for tokenization in the cloud.
Compliance and Tokenization
Tokenization technologies and regulatory compliance are closely linked. Because tokenization completely replaces sensitive values with random values, systems that handle only the token rather than the real value are often exempt from the audits and assessments that regulations require, reducing the duration and cost of deployment.
In an environment where the application that processes the data resides outside the regions to which personal data may be transferred (often under privacy laws), tokenization allows you to use the application without violating data residency constraints, since the data the application processes bears no relationship to, and reveals no information about, the original data.
Lastly, breach notification laws often do not apply when tokenized data is compromised, provided the token database itself is not disclosed. This is the case under many US state breach notification laws as well as the EU’s Data Protection Directive.
Tokenization: Expert Series
DLA Piper’s privacy experts discuss meeting European data protection and security requirements with CipherCloud solutions
Visa’s tokenization best practices guide is one of the earliest industry guides for secure tokenization.
Smart Card Alliance Payments Council’s whitepaper on encryption and tokenization discusses these technologies’ impact on payment fraud prevention.
Resources: Insights and Media About Cloud Data Tokenization