Evaluating Data-centric Protection Solutions


This document is a guide for Enterprise Security Architects, Security Analysts, and CISOs evaluating and comparing tokenization solutions. Tokenization is an architecture model, not merely a technology or an API. Successful tokenization implementations come from evaluating critical areas of concern across data security, architectural compatibility, scale, performance, operations, monitoring, compliance auditing, and integration. The business value of tokenization is high when it succeeds, but as a business-critical foundation technology, that success will be short-lived without a thorough up-front assessment that goes beyond the commonly evaluated application interfaces and token format policies.
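To ground the terms "application interface" and "token format policy", the following is a minimal, hypothetical sketch of a vault-based tokenizer that preserves the length and the last four digits of a card number. All names here (`TokenVault`, `tokenize`, `detokenize`) are illustrative assumptions, not any vendor's API; a production solution adds key management, persistence, access control, and audited detokenization, none of which is shown.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault (hypothetical example only)."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, pan: str) -> str:
        # Return the existing token for a repeated value, so the same
        # input always maps to the same token (consistent tokenization).
        if pan in self._value_to_token:
            return self._value_to_token[pan]
        # Format policy: random digits, same length, last four preserved
        # so downstream applications can still display them.
        while True:
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(pan) - 4)) + pan[-4:]
            if token != pan and token not in self._token_to_value:
                break
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Reverse lookup; in a real system this would be access-controlled.
        return self._token_to_value[token]
```

Even this toy example hints at the evaluation areas the guide covers: the vault is a scale and performance bottleneck, its mapping table is a data-security target, and detokenization calls are exactly what compliance auditing must monitor.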
