ArmorPII

Safely access LLMs without losing your customer data

ArmorPII filters personal customer information out of your prompts before they reach the LLM, then rehydrates the generated response with the original customer data.
Supports pre-built and custom rules.
Self host on your private cloud.
No data leaves your domain. Easy plug and play!

Everything you need to securely start using LLMs

ArmorPII comes batteries included. Secure defaults mean you don't have to worry about leaking your customers' PII to the big-tech LLMs.

Pre-built Rules

Built-in redaction rules for common PII like SSNs, phone numbers, and names
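
The idea behind rule-based redaction can be sketched in a few lines. The patterns and placeholder format below are illustrative only, not ArmorPII's actual rule set:

```python
import re

# Illustrative patterns only -- ArmorPII ships its own vetted rule set.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each PII match with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Call 555-867-5309 about SSN 123-45-6789."))
# → Call [PHONE] about SSN [SSN].
```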

Custom Regex Rules

Create custom regex rules to filter out data specific to your use case
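
A custom rule works the same way as a built-in one. The order-ID pattern and rule shape below are hypothetical examples, not ArmorPII's configuration format:

```python
import re

# Hypothetical custom rule: internal order IDs shaped like ORD-123456.
ORDER_ID = re.compile(r"\bORD-\d{6}\b")

def apply_custom_rule(text: str) -> str:
    """Redact matches of the custom pattern with a typed placeholder."""
    return ORDER_ID.sub("[ORDER_ID]", text)

print(apply_custom_rule("Refund ORD-482910 before Friday."))
# → Refund [ORDER_ID] before Friday.
```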

LLM-based filtering

Use a local LLM to create abstract filters for PII that regex rules can't describe, such as customer company names.
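
Conceptually, the local model is asked to list the sensitive spans a regex can't capture, and those spans are then replaced. The `local_llm` function below is a stand-in stub; a real deployment would call a self-hosted model:

```python
# Stand-in for a self-hosted model that returns PII spans it found.
def local_llm(text: str) -> list[str]:
    # Stub: pretend the model spotted one company name.
    return ["Acme Corp"]

def llm_filter(text: str) -> str:
    """Replace each model-identified span with a typed placeholder."""
    for i, span in enumerate(local_llm(text)):
        text = text.replace(span, f"[COMPANY_{i}]")
    return text

print(llm_filter("Acme Corp renewed their contract."))
# → [COMPANY_0] renewed their contract.
```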

Broad Integration

Supports all the popular models from OpenAI, Anthropic, and Gemini out of the box, plus custom models.

Rehydrate Responses automatically

Generated responses are automatically rehydrated with the original customer data, so they stay customer specific. It just works!
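
The redact-then-rehydrate round trip can be sketched as follows. Each match becomes a unique placeholder, the mapping never leaves your domain, and the original value is restored in the model's reply. The toy name matcher and placeholder format are assumptions for illustration:

```python
import re

# Toy name matcher for the sketch; real rules would be far broader.
NAME = re.compile(r"\b(?:Alice|Bob) [A-Z][a-z]+\b")

def redact(text: str):
    """Replace each match with a unique placeholder; return the mapping."""
    mapping = {}
    def replace(match):
        key = f"<PII_{len(mapping)}>"
        mapping[key] = match.group(0)
        return key
    return NAME.sub(replace, text), mapping

def rehydrate(text: str, mapping: dict) -> str:
    """Restore the original values in the generated response."""
    for key, value in mapping.items():
        text = text.replace(key, value)
    return text

prompt, mapping = redact("Draft a welcome email for Alice Nguyen.")
# The LLM only ever sees: "Draft a welcome email for <PII_0>."
reply = "Hi <PII_0>, welcome aboard!"  # stand-in for the model's reply
print(rehydrate(reply, mapping))
# → Hi Alice Nguyen, welcome aboard!
```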

Easy Plug and Play

ArmorPII can be deployed to a cloud instance with one click, and you can start filtering your prompts as easy as 1-2-3.

Works with your technologies

AI with Safety

Get set up quickly and access your favorite AI APIs from OpenAI, Anthropic, and Gemini without leaking your customer data.