What is tokenization?
In data security, tokenization is a process in which sensitive data is replaced with a non-sensitive, random value called a “token”. The original data can be recovered from the token (detokenized) only through the tokenization system; without that system, it is infeasible to derive the original sensitive data from the token. The system that performs tokenization and detokenization is called a tokenization system.
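To make the idea concrete, here is a minimal sketch of how a tokenization system could map sensitive values to random tokens and back. The class name, the in-memory dictionary acting as a vault, and the use of `secrets.token_hex` are illustrative assumptions; a real tokenization system would keep the vault in a hardened, access-controlled data store.

```python
import secrets

class TokenizationSystem:
    """Illustrative sketch only: an in-memory vault mapping tokens to originals."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # Replace the sensitive value with a random, non-sensitive token.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization system can map the token back to the original.
        return self._vault[token]

system = TokenizationSystem()
token = system.tokenize("4111 1111 1111 1111")
print(token)                     # random value; reveals nothing about the card number
print(system.detokenize(token))  # "4111 1111 1111 1111"
```

Because the token is generated randomly rather than derived mathematically from the original value, anyone who obtains only the token learns nothing about the sensitive data it stands in for.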
In this article, we will discuss:
- What is tokenization?
- How does tokenization work?
- How does Credit Card Tokenization work?
- What are Durable Tokens and Transaction-Based Tokens?
- What is the difference between tokenization and encryption?
- What is the difference between Format Preserving Encryption and tokenization?
- What is the difference between tokenization and data masking?
- Vaulted vs Vaultless Tokenization
- What is vaulted tokenization, and how does it work?
- What is vaultless tokenization, and how does it work?
- Vaulted or vaultless tokenization – which one is better?