POST /identities/:id/tokenize
curl -X POST https://YOUR_BLNK_INSTANCE_URL/identities/idt_1234567890/tokenize \
  -H "Content-Type: application/json" \
  -H "X-Blnk-Key: YOUR_API_KEY" \
  -d '{
    "fields": ["first_name", "last_name", "email_address"]
  }'
{
  "message": "Fields tokenized successfully"
}
Tokenizes multiple fields in an identity record in a single operation. This is more efficient than tokenizing fields individually when you need to protect multiple PII fields.

Path Parameters

id
string
required
The unique identifier of the identity

Request Body

fields
array
required
Array of field names to tokenize. Each field must be one of the tokenizable fields listed below. Example: ["first_name", "last_name", "email_address", "phone_number"]

Tokenizable Fields

Only the following fields can be tokenized:
  • first_name (or FirstName)
  • last_name (or LastName)
  • other_names (or OtherNames)
  • email_address (or EmailAddress)
  • phone_number (or PhoneNumber)
  • street (or Street)
  • post_code (or PostCode)

Response

message
string
Success message confirming the fields were tokenized

Behavior

Sequential Processing

Fields are tokenized sequentially in the order provided. If any field fails to tokenize:
  1. The operation stops immediately
  2. An error is returned
  3. Previously tokenized fields in the same request remain tokenized
  4. Fields not yet processed remain unchanged
Example: If you request ["first_name", "last_name", "email_address"] and last_name is already tokenized:
  • first_name will be successfully tokenized
  • Error returned for last_name (already tokenized)
  • email_address will not be processed
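The stop-on-first-error behavior above can be modeled as a pure function. This is an illustrative sketch (not part of the Blnk API) that predicts the outcome of a bulk request given the set of fields that are already tokenized:

```javascript
// Predict the outcome of a bulk tokenize request, mirroring the
// sequential stop-on-error behavior described above.
function predictBulkTokenize(requestedFields, alreadyTokenized) {
  const tokenized = [];
  const skipped = [];
  let failed = null;

  for (const field of requestedFields) {
    if (failed) {
      skipped.push(field); // never processed after the first error
    } else if (alreadyTokenized.includes(field)) {
      failed = field;      // the operation stops here
    } else {
      tokenized.push(field);
    }
  }
  return { tokenized, failed, skipped };
}

// Matches the example above: last_name is already tokenized.
const result = predictBulkTokenize(
  ['first_name', 'last_name', 'email_address'],
  ['last_name']
);
// result.tokenized → ['first_name']
// result.failed    → 'last_name'
// result.skipped   → ['email_address']
```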

Atomicity Consideration

This operation is not atomic. If it fails partway through, some fields may be tokenized while others are not. Check the Get Tokenized Fields endpoint to verify the current state.

Use Cases

New Customer Onboarding

Tokenize all PII immediately after customer registration:
// After creating identity
const response = await fetch(
  `https://YOUR_BLNK_INSTANCE_URL/identities/${identityId}/tokenize`,
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Blnk-Key': 'YOUR_API_KEY'
    },
    body: JSON.stringify({
      fields: [
        'first_name',
        'last_name',
        'email_address',
        'phone_number',
        'street',
        'post_code'
      ]
    })
  }
);

Compliance Migration

Migrate existing records to tokenized format:
// Bulk tokenize existing identities
const identities = await fetchAllIdentities();

for (const identity of identities) {
  // Check which fields need tokenization
  const untokenizedFields = getUntokenizedPII(identity);
  
  if (untokenizedFields.length > 0) {
    await tokenizeFields(identity.identity_id, untokenizedFields);
    await sleep(100); // Rate limiting
  }
}
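getUntokenizedPII, tokenizeFields, and sleep in the loop above are hypothetical helpers, not part of any SDK. A minimal sketch of the two that don't touch the network, assuming you track already-tokenized field names on the identity object (the `tokenized_fields` property here is an assumption, not an API shape):

```javascript
// Fields this API allows tokenizing (from the list above).
const TOKENIZABLE_FIELDS = [
  'first_name', 'last_name', 'other_names',
  'email_address', 'phone_number', 'street', 'post_code'
];

// Returns the tokenizable fields that are present on the identity
// but not yet tokenized. Assumes `identity.tokenized_fields` holds
// the names already tokenized (an assumed shape for this sketch).
function getUntokenizedPII(identity) {
  const done = identity.tokenized_fields || [];
  return TOKENIZABLE_FIELDS.filter(
    field => identity[field] != null && !done.includes(field)
  );
}

// Simple delay helper for rate limiting.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

// Example:
const pending = getUntokenizedPII({
  first_name: 'Ada',
  email_address: 'ada@example.com',
  tokenized_fields: ['first_name']
});
// pending → ['email_address']
```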

Selective Protection

Tokenize only contact information while keeping name fields accessible:
{
  "fields": ["email_address", "phone_number"]
}

Best Practices

1. Verify Before Tokenizing

Check the current tokenization state before attempting to tokenize:
const { tokenized_fields } = await getTokenizedFields(identityId);
const fieldsToTokenize = ['email_address', 'phone_number']
  .filter(field => !tokenized_fields.includes(field));

if (fieldsToTokenize.length > 0) {
  await tokenizeFields(identityId, fieldsToTokenize);
}

2. Handle Errors Gracefully

Implement retry logic with exponential backoff:
async function tokenizeWithRetry(identityId, fields, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await tokenizeFields(identityId, fields);
    } catch (error) {
      if (i === maxRetries - 1) throw error;
      await sleep(Math.pow(2, i) * 1000);
    }
  }
}

3. Batch Processing

When tokenizing many identities, process in batches with rate limiting:
const BATCH_SIZE = 10;
const DELAY_MS = 100;

for (let i = 0; i < identities.length; i += BATCH_SIZE) {
  const batch = identities.slice(i, i + BATCH_SIZE);
  
  await Promise.all(
    batch.map(identity => 
      tokenizeFields(identity.identity_id, ['email_address', 'phone_number'])
    )
  );
  
  await sleep(DELAY_MS);
}

4. Audit and Logging

Log all tokenization operations:
await auditLog.create({
  action: 'TOKENIZE_FIELDS',
  identityId: identityId,
  fields: fields,
  userId: currentUser.id,
  timestamp: new Date(),
  reason: 'compliance_requirement'
});

await tokenizeFields(identityId, fields);

Performance Considerations

  • Processing Time: ~50-100ms per field
  • Database Updates: One update per field
  • Recommended Batch Size: 5-10 fields at a time
  • Rate Limiting: Consider limiting to 10 requests/second

Error Recovery

If tokenization fails partway through:
  1. Check which fields were successfully tokenized:
    GET /identities/:id/tokenized-fields
    
  2. Retry with only the failed fields:
    {
      "fields": ["remaining_field_1", "remaining_field_2"]
    }
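The two recovery steps can be combined into a small helper. This sketch assumes getTokenizedFields and tokenizeFields wrappers for the two endpoints (as in the earlier examples); the pure diffing step is separated out so it can be tested on its own:

```javascript
// Compute the retry request body: the originally requested fields
// minus those the GET /identities/:id/tokenized-fields endpoint
// reports as already tokenized.
function buildRetryBody(requestedFields, tokenizedFields) {
  return {
    fields: requestedFields.filter(f => !tokenizedFields.includes(f))
  };
}

// Hypothetical recovery flow (getTokenizedFields / tokenizeFields
// are assumed API wrappers, not SDK functions):
async function recoverTokenization(identityId, requestedFields) {
  const { tokenized_fields } = await getTokenizedFields(identityId);
  const body = buildRetryBody(requestedFields, tokenized_fields);
  if (body.fields.length > 0) {
    await tokenizeFields(identityId, body.fields);
  }
}
```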
    

Comparison with Single Field Tokenization

Aspect         | Multiple Fields             | Single Field
---------------|-----------------------------|----------------------------
API Calls      | 1 request                   | N requests (one per field)
Performance    | Faster for multiple fields  | Better for single field
Error Handling | Stops on first error        | Independent failures
Atomicity      | Not atomic                  | Each field independent
Use Case       | Bulk operations             | Selective tokenization
Recommendation: Use this endpoint when tokenizing 2+ fields. For single fields, use Tokenize Field for better error isolation.

Security Notes

Ensure BLNK_TOKENIZATION_SECRET is set to a secure 32-byte value before using tokenization features.
  • Log all tokenization operations (recommended)
  • Tokenization is irreversible without the encryption key
  • Backup your encryption key securely
  • Rotate encryption keys periodically (requires re-tokenization)
