What is Tokenization?
A token is a non-exploitable identifier that references sensitive data. Tokens can take any format, are safe to expose, and are easy to integrate into existing systems. Tokenization is the process of securely storing sensitive data and issuing a token that references it. A tokenization platform handles this exchange, which looks something like this (a minimal sketch follows the list):
- You enter sensitive data into a tokenization platform.
- The tokenization platform securely stores the sensitive data.
- The platform returns a token to use in place of your sensitive data.
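To make the exchange concrete, here is a minimal, illustrative sketch of that flow in Python. The `TokenVault` class, its methods, and the `tok_` prefix are hypothetical, not any particular platform's API; a real platform would encrypt the stored values, persist them in hardened storage, and gate detokenization behind access controls.

```python
import secrets

class TokenVault:
    """A toy, in-memory stand-in for a tokenization platform."""

    def __init__(self) -> None:
        # Maps non-exploitable tokens to the sensitive values they reference.
        self._store: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        # Generate a random identifier with no mathematical relationship
        # to the underlying data, so the token is safe to expose.
        token = f"tok_{secrets.token_urlsafe(16)}"
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # In a real platform, exchanging a token back for the original
        # value would be gated by authorization checks.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4242 4242 4242 4242")  # store the card number, get a token
print(token)                                   # e.g. tok_Ab3... safe to log or store
print(vault.detokenize(token))                 # original value, access-controlled
```

The key design property is that the token is random rather than derived from the data, so possessing a token reveals nothing about the value it references.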