Talk:Tokenization (data security)

From Wikipedia, the free encyclopedia

Tokenizing is the operation of replacing one set of symbols with another, typically to make the resulting set of symbols smaller.

This is not the common usage of the term. In computer science it normally means to split a string up into tokens (e.g. key words, separators, etc.), not to replace a list of tokens with smaller tokens.

I am not familiar with the usage of tokenizing given in the previous version of the article, but I will leave it as an alternative meaning of tokenizing until I can verify whether it is incorrect or not.

Steve-o 03:47, 15 Apr 2004 (UTC)

Tokenizing in politics has a different meaning that would be worth adding to this article, or putting into another one.--Lizzard 22:47, 13 September 2006 (UTC)

I expected this page to be about lexical analysis, not security. 66.75.141.47 (talk) 08:09, 30 August 2009 (UTC)

The section on human perception

  • remote enough to be situated in a different article
  • needs a cite
  • was left in for the time being pending suggestions for a more appropriate location



WTF? Where is all the content!? I expected to see lexical tokenization...but this page is empty. I think someone screwed up. Actually, it's worse than I thought. Clicking on "Discussion" for Tokenization leads to this page, the one for Data Security. The main tokenization page is empty: http://en.wikipedia.org/wiki/Tokenization

Incomprehensible

I don't feel I'm especially dim, but after reading the first (below) and subsequent few paras of this page, I'm none the wiser. How does it help to tell me that tokenization is done using tokenization?

"Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference (i.e. identifier) that maps back to the sensitive data through a tokenization system. The mapping from original data to a token uses methods which render tokens infeasible to reverse in the absence of the tokenization system, for example using tokens created from random numbers.[1] The tokenization system must be secured and validated using security best practices[2] applicable to sensitive data protection, secure storage, audit, authentication and authorization. The tokenization system provides data processing applications with the authority and interfaces to request tokens, or detokenize back to sensitive data." — Preceding unsigned comment added by Holytrinity1 (talk · contribs) 13:27, 23 March 2015 (UTC)

It is incredibly dumb to define a term using the term itself. This seems to be rampant in many areas. I'm not sure why this is happening. I attribute it to an inability to properly articulate ideas and an overall disinterest in linguistic precision.
That being said, (perhaps unnecessarily) I think every instance of "tokenization system" needs to be replaced with "translation system" or maybe "conversion system" which is what actually generally describes the process. A set of PII is converted to non-identifiable information and a third party validates the converted data.
Tokenizing describes the encapsulation and conversion of the data into a string of unusable data. Kingram6865 (talk) 11:31, 27 January 2023 (UTC)
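The definition quoted above may be easier to follow as code than as prose: a tokenization system is essentially a secured lookup table that maps random tokens back to sensitive values. The sketch below is a minimal, hypothetical illustration in Python (the `TokenVault` name and its methods are invented here, not any real product's API); a real system would add the access control, audit, and secure storage the quoted passage mentions.

```python
import secrets

class TokenVault:
    """Minimal sketch of a token vault: random tokens map back to
    sensitive values only through this system, as in the quoted
    definition. Hypothetical API, for illustration only."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        """Return a token for value, reusing an existing one if present."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it has no exploitable relation to the value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Map a token back to the sensitive value (requires the vault)."""
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")   # example PAN
assert vault.detokenize(t) == "4111111111111111"
```

The key property is that without access to the vault's mapping, the token is just a random string, which is what the article means by "infeasible to reverse in the absence of the tokenization system".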


Reads like an Ad

This could well be just a copy/paste from any of those websites of the companies who sell the service. As with any technology, it has its limits, and calling it secure is... well, subjective at best. I'd rather vehemently object to the article's objectivity. Unpaid advertisement comes to mind.

"This secure billing application allows clients to safely and securely process recurring payments without the need to store cardholder payment information. Tokenization replaces the Primary Account Number (PAN) with secure, randomly generated tokens." eNTi (talk) 09:20, 11 July 2018 (UTC)

It definitely is written like a marketing piece and has a WP:PEACOCK issue which I have tagged. I have also run a WP:COPYVIO check to make sure that it was not copied from industry marketing materials. The opposite appears to be the case, industry marketing materials are using excerpts from this article. ~Kvng (talk) 14:50, 10 October 2018 (UTC)

I've made an attempt to fix several of these peacock terms. [1] Please point out any remaining examples. Jehochman Talk 16:29, 29 September 2020 (UTC)

Apple Pay

I've removed an assertion that tokens used in Apple Pay could "be used in an unlimited fashion or even in a broadly applicable manner." This is factually incorrect. Apple Pay software utilizes the host device's Secure Element, which is a physical chip inside the device. The token stored on the device only has value when used by that same device, i.e., without the Secure Element, the token is useless. The removed assertion had no citation for me to refute further.

Source: https://support.apple.com/guide/security/secure-element-and-nfc-controller-seccb53a35f0/1/web/1

~ JDCAce (talk) 16:41, 23 May 2021 (UTC)
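The device-binding point above can be illustrated with a toy sketch: the token alone is useless because each payment also requires a per-transaction cryptogram computed with a key held only by that device's Secure Element. The Python below is an invented HMAC-based illustration of that idea, not Apple's actual EMV payment tokenization protocol; every name and the choice of HMAC here are assumptions for the example.

```python
import hashlib
import hmac
import secrets

# Toy illustration of device-bound tokens. NOT Apple's real protocol;
# function names and the HMAC construction are hypothetical.

def make_cryptogram(device_key: bytes, token: str, tx_data: str) -> str:
    """Per-transaction cryptogram: only the device holding device_key
    can produce a valid value for this token and transaction."""
    return hmac.new(device_key, (token + tx_data).encode(), hashlib.sha256).hexdigest()

def verify(device_key: bytes, token: str, tx_data: str, cryptogram: str) -> bool:
    """Network-side check with the same per-device key."""
    expected = make_cryptogram(device_key, token, tx_data)
    return hmac.compare_digest(expected, cryptogram)

device_key = secrets.token_bytes(32)   # stands in for a key provisioned into the Secure Element
token = "4900000000000001"             # device account number (token), not the real PAN
c = make_cryptogram(device_key, token, "txn-42")

assert verify(device_key, token, "txn-42", c)
# A stolen token without the device key cannot produce a valid cryptogram:
assert not verify(secrets.token_bytes(32), token, "txn-42", c)
```

This is why the removed assertion ("used in an unlimited fashion") fails: intercepting the token does not yield the per-device key, so the token cannot be replayed from another device.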