DPO & compliance
The brief that helps your DPO say yes.
Inference stays on your machine. Control plane and chat history are EU-hosted with explicit retention policies. OwnLLM clarifies where data flows, who can access the service, how keys are revoked, and what stays under customer control.
Book an AI cost audit

Access control
SSO
SAML/OIDC on Startup. SCIM 2.0 and admin 2FA required on Enterprise (on the roadmap).
Logs
90d+
Audit metadata on Startup. 12-month retention with CSV/API export on Enterprise (coming soon).
Inference
Local
Models run on the customer's paired machine. Prompt content is not stored in audit logs by default.
Compliance objections handled before the call
Employee offboarding
Centralized API key revocation is available today. SCIM deprovisioning automation is on the roadmap.
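In practice, centralized revocation means every request is checked against a shared revocation list before a key is honored. A minimal sketch of that check (the class and method names here are illustrative, not OwnLLM's actual API):

```python
class KeyRegistry:
    """Illustrative central registry: revoking a key takes effect on the next check."""

    def __init__(self) -> None:
        self._revoked: set[str] = set()

    def revoke(self, key_id: str) -> None:
        # Called once by an admin during offboarding; applies tenant-wide.
        self._revoked.add(key_id)

    def is_active(self, key_id: str) -> bool:
        # Gate every inbound request on this check.
        return key_id not in self._revoked
```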
AI responsibility
OwnLLM provides the infrastructure; the customer remains responsible for the use cases and models they enable.
DPA and subprocessors
EU subprocessors for the control plane, a standard DPA, and a customizable Enterprise version.
DPO questions and short answers
Where is the data?
Inference on your machine. Control plane EU-hosted.
Not fully air-gapped: chat history passes through OwnLLM servers.
Configurable retention + per-tenant encryption at rest.
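Configurable retention reduces to a simple rule: any record older than the tenant's retention window is eligible for purge. A sketch of that cutoff logic, assuming timestamps are stored in UTC (the function name is illustrative):

```python
from datetime import datetime, timedelta, timezone

def is_past_retention(record_ts: datetime, retention_days: int) -> bool:
    """True if a record is older than the tenant's configured retention window."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    return record_ts < cutoff
```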
Who can access it?
SSO, admin/member roles, API scopes.
Magic link only on Team plan.
SSO on Startup. SCIM provisioning on Enterprise (coming soon).
What do you log?
Usage, model, tokens, timestamp, channel.
No prompt content in audit logs by default: useful metadata without sensitive payloads.
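The fields above can be pictured as a record that deliberately has no slot for prompt text. A minimal sketch, assuming an ISO 8601 UTC timestamp (field names are illustrative, not OwnLLM's actual schema):

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    """Audit metadata only: there is no field for prompt content."""
    user_id: str    # who made the request (pseudonymous ID)
    model: str      # model invoked on the paired machine
    tokens: int     # token count, for usage accounting
    timestamp: str  # ISO 8601, UTC
    channel: str    # e.g. "chat" or "api"

def log_event(user_id: str, model: str, tokens: int, channel: str) -> dict:
    event = AuditEvent(
        user_id=user_id,
        model=model,
        tokens=tokens,
        timestamp=datetime.now(timezone.utc).isoformat(),
        channel=channel,
    )
    return asdict(event)
```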
What about Cloudflare risk?
Outbound tunnel, TLS, per-tenant shared secret.
Today, OwnLLM relies on Cloudflare for the tunnel.
Self-hosted relay path available once scale justifies it.
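A per-tenant shared secret typically means each message over the tunnel is authenticated with an HMAC, so one tenant's secret cannot validate another tenant's traffic. A minimal sketch of that pattern using Python's standard library (this is an illustration of the technique, not OwnLLM's wire format):

```python
import hashlib
import hmac

def sign_request(tenant_secret: bytes, body: bytes) -> str:
    """Sign a request body with the tenant's shared secret (HMAC-SHA256)."""
    return hmac.new(tenant_secret, body, hashlib.sha256).hexdigest()

def verify_request(tenant_secret: bytes, body: bytes, signature: str) -> bool:
    """Constant-time check that the signature matches this tenant's secret."""
    expected = sign_request(tenant_secret, body)
    return hmac.compare_digest(expected, signature)
```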
Compliance is the sales lever, not the blocker.
For regulated SMBs, OwnLLM turns 'can we use AI?' into 'which machine and which policies do we enable?'. Talk to us about your DPO's specific concerns.