Leaked Tokens, AI And The “Who’s Responsible?” Moment

The average data breach now costs $4.44 million globally, while U.S. companies face an eye-watering $10.22 million in damages. The kicker: organizations with high levels of “shadow AI” (unauthorized AI tools used by employees) pay an extra $670,000 in breach costs.

The AI governance gap is staggering: 63% of breached organizations either don’t have AI governance policies or are still developing them. Among companies that experienced AI-related breaches, 97% lacked proper AI access controls.

If you upload proprietary data into a public model and that data gets reused, who do you blame: the model or the hand that fed it? The uncomfortable answer: it’s probably on you. AI does what you ask. And it is still the human who decides what’s safe to ask.

This is the part many skip when rushing to build AI features. Privacy policies, data handling rules, token scoping – they sound like legal paperwork. But they define exactly how your systems behave when things go sideways. If you don’t have clear internal rules around how you manage sensitive data (and what ends up in prompts), you are walking into trouble blind.
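
To make the “what ends up in prompts” rule concrete, here is a minimal sketch in Python: a small helper that scrubs anything credential-shaped out of text before it reaches a model. The patterns and the scrub_prompt name are illustrative assumptions, not a complete data-loss-prevention setup.

```python
import re

# Illustrative patterns only; a real ruleset would be broader.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),                 # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),                    # AWS access key IDs
    re.compile(r"(?i)bearer\s+[A-Za-z0-9\-._~+/]+=*"),  # bearer tokens
]

def scrub_prompt(text: str) -> str:
    """Replace anything that looks like a credential with a placeholder."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

# Scrub before the API call, never after.
raw_config = "endpoint=https://api.example.com api_key=sk-EXAMPLEKEY1234567890abcd"
print(scrub_prompt(f"Summarize this config: {raw_config}"))
# -> Summarize this config: endpoint=https://api.example.com api_key=[REDACTED]
```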

Now, is that likely? Not with serious providers. OpenAI, Google, Anthropic and the rest have guardrails and enterprise-grade isolation for paid customers. But not every company uses those versions. Some still paste business-sensitive data into the free playground and call it a day. Others ship LLM integrations into production without sandboxing or prompt auditing.
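
Prompt auditing does not have to be heavyweight, either. As a rough sketch (call_model is a stand-in for whatever client function your stack actually uses), you can wrap every model call so there is at least a hashed, timestamped trail of who sent what and how large the response was:

```python
import hashlib
import json
import logging
import time

logging.basicConfig(filename="llm_audit.log", level=logging.INFO)

def audited_call(call_model, prompt: str, user: str) -> str:
    """Run a model call and leave an audit record behind.

    call_model is assumed to be any function that takes a prompt string
    and returns the model's reply as a string.
    """
    record = {
        "ts": time.time(),
        "user": user,
        # Log a hash and a length, not the raw prompt, so the audit log
        # itself never becomes the thing that leaks.
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "prompt_chars": len(prompt),
    }
    response = call_model(prompt)
    record["response_chars"] = len(response)
    logging.info(json.dumps(record))
    return response

# Usage with a dummy model so the sketch runs on its own:
if __name__ == "__main__":
    echo_model = lambda p: f"(pretend answer to {len(p)} chars of prompt)"
    print(audited_call(echo_model, "Draft a release note for v2.3", user="analyst"))
```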

The real risk is the mix of human shortcuts and powerful systems. When prompts, credentials, training sets and responses end up in the wrong place, the dominoes fall fast.

But if you set up clear privacy rules, lock down secrets, monitor usage and treat your AI tools the way you treat your production servers, you are using AI safely. That’s the kind of mindset that turns “oops” into “nothing happened because we were ready.”
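
On the “lock down secrets” point, two small habits go a long way: read credentials from the environment instead of hardcoding them, and refuse to send any prompt that happens to contain one. A hedged sketch, assuming an LLM_API_KEY variable name:

```python
import os

def require_api_key() -> str:
    """Read the model credential from the environment; never hardcode it."""
    key = os.environ.get("LLM_API_KEY")  # variable name is an assumption
    if not key:
        raise RuntimeError("LLM_API_KEY is not set; refusing to start.")
    return key

def assert_no_secret_in_prompt(prompt: str, key: str) -> None:
    """Cheap guardrail: never ship the credential inside the prompt itself."""
    if key and key in prompt:
        raise ValueError("Prompt contains the API key; aborting the call.")
```

Neither function is clever, and that is the point: most leaked-token incidents start with a key sitting somewhere it never needed to be.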

Maryia Puhachova
