Tokenization: the bid to improve security in the credit card processing chain

The concept behind tokenization is simple: hackers can’t steal what’s not there. The credit card industry has been intrigued by the concept’s potential to improve card security ever since Shift4 unveiled it at an industry security summit in Las Vegas in 2005.

Tokenization

Advocates of tokenization claim that its use can eliminate the presence of primary account number (PAN) data within a bank’s or merchant’s environment.

From a security standpoint, stakeholders in the credit card processing chain are bound by PCI compliance: ensuring that sensitive cardholder data remains secure whenever it is stored, transmitted or processed.

A growing number of retailers already use tokenization as a way to reduce PCI scope, and several vendors sell tokenization products and services. EMVCo has also published a tokenization specification that defines how tokenization providers can interoperate with card schemes.

Critics say that tokenization ticks only one box of PCI DSS compliance: the secure storage of sensitive data. However, secure storage alone is still an extremely beneficial element of the tokenization concept.

What is tokenization?

Tokenization is essentially a process where the PAN is replaced by a token for transaction purposes. The token has no mathematical relationship with the PAN it replaces. The token can be the same length and format as the original PAN, so it appears no different from a standard payment card number to back-end transaction processing systems, applications and storage.

The encrypted card data is typically sent to a merchant’s payment processor for decryption and authorization.

The processor issues a token representing the transaction back to the retailer, while the actual card number is securely stored in a virtual vault. Tokens can only be reversed to their original values with access to a “look-up” table that matches each token to the value it replaced.

A customer’s card data is encrypted when the card is swiped at a payment terminal and sent to the processor, where it is decrypted for transaction approval; a token is then issued to the merchant, all without the customer experiencing anything different.
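At its core, the flow above rests on a look-up table kept by the tokenization provider. The Python sketch below is a minimal illustration of that idea only – the `TokenVault` class and its method names are assumptions for the example, and a real vault would encrypt the stored PANs and sit behind strict access controls:

```python
import secrets

class TokenVault:
    """Toy token vault: maps random surrogate tokens to real PANs."""

    def __init__(self):
        self._table = {}  # the "look-up" table: token -> PAN

    def tokenize(self, pan: str) -> str:
        # The token is random, so it has no mathematical link to the PAN.
        token = secrets.token_hex(8)
        self._table[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the table can recover the real PAN.
        return self._table[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                    # merchant never stores the PAN
assert vault.detokenize(token) == "4111111111111111"  # processor can still recover it
```

The merchant keeps only `token`; recovering the PAN requires access to the vault, which is exactly why protecting the vault itself remains critical.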

Why use it?

The benefits of tokenization to banks/merchants include:

  • Sensitive cardholder data stays within the bank’s/merchant’s control at all times; no external systems have access to the real data.
  • Strong tokens have unique security strength because they are not mathematically linked to the original values they replace – a token is generated from a random set of numbers and symbols, or assigned sequentially, rather than derived from the PAN.
  • Tokens can be produced that maintain the same structure and data type as their original values (e.g. numeric, 16 digits, preserving the last four digits of the original PAN).
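The last point can be sketched in a few lines of Python. This is purely illustrative – the function name is an assumption, and real products use vetted schemes rather than this naive construction – but it shows how a surrogate can keep the shape of a 16-digit PAN while preserving its last four digits:

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Build a random surrogate with the same length and format as the PAN,
    keeping its last four digits so back-end systems and receipts still work."""
    random_digits = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    return random_digits + pan[-4:]

token = format_preserving_token("4111111111111111")
print(len(token), token.isdigit(), token[-4:])  # 16 True 1111
```

Because the leading digits are random rather than computed from the PAN, the token reveals nothing beyond the last four digits it deliberately preserves.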

Tokenization and PCI compliance

Tokenization reinforces the overall objectives of PCI standards, as well as specific requirements. Most importantly, it addresses PCI requirement 3: “Protect stored cardholder data.” By returning only tokens in response to merchant requests, cardholder information need not be stored on the merchant system. Nor do merchants have to worry about ensuring that the method of encryption used is of adequate strength and complexity. And, since the merchant is not required to encrypt the token, there are no encryption keys to manage.

PCI requirement 3.4 mandates that all cardholder data be rendered unreadable anywhere it is stored, using one of several approved methods, including strong encryption. One of the methods suggested is truncation. Since only the last four digits of the card number are retained in the token, the tokenization process meets this requirement.
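As a concrete illustration of truncation (the function name below is an assumption for the example, not part of any PCI tool), masking a PAN down to its last four digits looks like this:

```python
def truncate_pan(pan: str) -> str:
    # Truncation for display/storage: mask everything but the last four digits.
    return "*" * (len(pan) - 4) + pan[-4:]

print(truncate_pan("4111111111111111"))  # ************1111
```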

Since the merchant is not using encryption to derive the token, the tokenization process renders requirements 3.5 and 3.6 moot. Requirement 3.5 states that all encryption keys should be protected against disclosure and misuse. The token is sent to the merchant, rather than derived by the merchant using encryption keys. Tokenization reduces the need for comprehensive key management and also reduces the potential points of attack in a system.

Tokenization v. encryption

The aim of both data encryption and tokenization is to prevent cybercriminals from stealing sensitive consumer data and to provide fraud protection for retailers, banks and consumers. Encryption is a form of obfuscation, rendering sensitive data unreadable; a key is used to return the obfuscated data to its original form. Point-to-point encryption (P2PE) protects card data from the moment of card entry into, for example, a POS terminal.

A token, however, is not mathematically linked to the original data, so tokens cannot be reversed without access to the “look-up” table that matches them to their original values. These tables are typically kept in a database in a secure location inside a company’s firewall.

With tokenization, an external card processing system is not aware that it is using a surrogate credit card number (the token). With encryption, an external system knows it is dealing with an obfuscated card number and must decrypt it with a key. Ramon Krikken, Research VP with Gartner, argued in a 2011 interview that tokenization cannot be implemented without encryption: “Ultimately, the real data needs to live somewhere, and that data needs to be protected. You can’t tokenize the token. There are only so many steps you can take. That’s why you can’t really do one without the other.”

Great for the travel and entertainment industries

Tokenization is well suited to systems that do not need to retain PAN data, such as those in the travel and entertainment (T&E) industries.

A weakness in the system

Tokenization is not immune from strong criticism. The author Slava Gomzin – in his book Hacking Point of Sale: Payment Application Secrets, Threats, and Solutions (Feb 2014) – dedicates a section to tokenization titled the ‘Fallacy of Tokenization’. Gomzin argues that tokenization has a drawback it cannot overcome: “it cannot protect sensitive authentication data as the payment processor and acquirer have to have the original data in order to authorise the payment.”

Tokenization, claims Gomzin, is only suitable for the storage of sensitive cardholder data and does little or nothing for protecting data that is processed and transmitted.

The simple fact is that the original credit card numbers still have to be stored somewhere: either with an outsourced vendor or in token vaults within internal systems. Either way, the protection of those systems remains critically important. From the retailer’s point of view, however, at least the protection of this data becomes somebody else’s problem once payment details are tokenized.

What now?

Tokenization is growing in popularity, especially in the United States, where there is an ongoing debate over the scheduled introduction of EMV cards. Visa and MasterCard want merchants and banks to start accepting EMV cards from October 2015. Some industry experts advocate tokenization instead of chip-and-PIN. In April 2014 Visa executive William Sheedy told attendees at the Electronic Transactions Association (ETA) conference that chip cards debuting in the United States will use tokenization to make mobile and online transactions more secure.

Where Aviso comes in

Aviso’s Novate payment switch is built with tokenization in mind: Novate can access token vaults on the fly in order to dynamically replace tokens with real card numbers for authorization and settlement messages. This allows existing point of sale systems to be seamlessly updated in order to support tokenization, without a wholesale replacement of a retailer’s payment systems.

Contact us

For more information on our products and services contact us at info@aviso.io, or follow us on Twitter and LinkedIn.
