
Where the Shadows Lie

Shadow APIs, Shadow AI, Shadow IT, Shadow Data… There's a lot of risk lurking in the shadows. Shadow technology can lead to data exposure, vulnerabilities, compliance risks, and plain old inefficiencies if not properly dealt with. Organizations rightfully want to get on top of the problem and are seeking help.

David Schlesinger, Chief Architect
Published on 2/27/2025 · 4 min. read

Shadow technology, regardless of the name, is a manifestation of the same issue: unmonitored, unauthorized, or hidden technology operating outside official oversight. Over the past ten years, with the widespread adoption of SaaS, Shadow IT became a significant concern for security teams. With the more recent explosion of AI tools, we've started to hear the term Shadow AI used for the same reasons.

A shadow that has taken shape

Shadow AI can include any AI usage that organizations don't have visibility into. It can broadly be broken down into a few categories:

  • AI usage in existing third-party SaaS tools. The SaaS tools you're already using have integrated AI into their platforms, introducing problems that weren't anticipated when the tools were purchased. For example, is my data being used for training, and was I automatically opted in? Unfortunately, platforms do not always explicitly ask for consent.
  • Use of AI chatbots or decision-making systems without IT governance. Employees might be pasting sensitive data into their personal AI tools—for example, ChatGPT—or using AI-assisted code editors.
  • AI integrations in your own platform. This can be as simple as deciding to use Anthropic’s or OpenAI’s APIs, or calling any other vendor that has incorporated AI into their offering (see the sketch after this list).
  • Internally developed AI models built without proper governance.
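
To make the API-integration category concrete, here is a minimal, hypothetical sketch of what such an integration often looks like in practice: an ordinary outbound HTTPS call to a vendor endpoint. The helper function, ticket scenario, and model name are illustrative assumptions; the endpoint and payload shape follow OpenAI's public chat completions API, but the same pattern applies to any AI vendor.

```python
# Hypothetical sketch: an "AI integration" in your own platform is often just
# an outbound HTTPS call to a third-party vendor. The endpoint and payload
# shape follow OpenAI's public chat completions API; the helper name and
# model are illustrative only.
import os

import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"


def summarize_ticket(ticket_text: str) -> str:
    """Send (potentially sensitive) ticket text to an external AI API."""
    response = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",
            "messages": [
                {"role": "user", "content": f"Summarize this ticket:\n{ticket_text}"}
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```

From the network's point of view, nothing in this request is specific to AI: it is an authenticated POST to an external domain, which is exactly why untracked calls like this become Shadow APIs.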

A new power is rising, or is it just the old one in disguise?

Before the term Shadow AI was coined, a major concern of ours at Riscosity was Shadow APIs: essentially untracked, undocumented, or unmonitored APIs, often the result of quick fixes, workarounds, or tech debt. I mentioned earlier that shadow technology is a manifestation of the same issue, and I think this is especially true of Shadow AI and Shadow APIs.

AI APIs are the backbone of Shadow AI. Most platforms that have integrated AI have done so through third-party APIs rather than models they build and host themselves. When employees use AI APIs without oversight, they are essentially creating Shadow APIs. Even if you develop an internal model, any application you build on top of it is likely using APIs to communicate with it, and those same APIs are where you can gain visibility into what the model receives and returns.

In essence, the solution you put in place to monitor data flow doesn’t need to know you’re using AI because, ultimately, sending sensitive or confidential information to an external system is enabled through the same mechanisms—regardless of whether the destination is using AI or not.
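
As a rough illustration of that point (a minimal sketch, not Riscosity's implementation), consider a destination-agnostic egress check: it looks for sensitive data in any outbound request body and neither knows nor cares whether the receiving endpoint is an AI model. The domain names and toy regex patterns below are placeholders standing in for real data classifiers.

```python
import re

# Minimal sketch, not Riscosity's implementation: a destination-agnostic
# egress check. Toy regexes stand in for real sensitive-data classifiers.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}


def inspect_outbound(destination_host: str, body: str) -> list[str]:
    """Return the sensitive data types found in an outbound request body."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(body)]


# The check neither knows nor cares that one destination serves an AI model.
outbound_requests = [
    ("api.openai.com", "Summarize: customer jane@example.com reported an outage"),
    ("api.some-crm.example", "Update the record for SSN 123-45-6789"),
]
for host, body in outbound_requests:
    findings = inspect_outbound(host, body)
    if findings:
        print(f"Sensitive data ({', '.join(findings)}) is leaving the network for {host}")
```

Because the check keys on the data rather than the destination, the same control catches an unsanctioned chatbot paste, an AI API integration, and a plain SaaS upload alike.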

You shall not pass! How we stop Shadow AI before it spreads

At Riscosity, we aim to give organizations a bird’s-eye view of all sensitive data being passed to third-party vendors. We want our customers to know who their vendors are, what data is being passed to said vendors, and to provide the tools to govern data in motion.

Our platform was designed to tackle the risks of Shadow AI before the problem even had a name. We take a surgical approach to data governance, equipping teams with the visibility they need to discover AI and API usage within their infrastructure without disrupting their day-to-day workflows. By continuously scanning code and analyzing DNS, and through our no-code, no-agent active governance engine, we give teams not only the knowledge but also the tools to quell shadow technology before it escalates into a problem that requires active mitigation.
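
For a sense of what DNS-based discovery can surface, here is a deliberately simplified sketch (not how Riscosity's engine works): matching resolver logs against a small list of known AI vendor domains is often enough to reveal AI traffic that no one registered with IT. The log format and domain list are assumptions made for the example.

```python
# Simplified sketch (not Riscosity's engine): flag lookups of known AI vendor
# domains in exported DNS resolver logs to surface unsanctioned AI use.
# The domain list and log format are assumptions for this example.
KNOWN_AI_DOMAINS = {
    "api.openai.com",
    "api.anthropic.com",
    "generativelanguage.googleapis.com",
}


def find_shadow_ai(dns_log_lines: list[str]) -> set[str]:
    """Return the known AI vendor domains that appear in resolver log lines."""
    hits = set()
    for line in dns_log_lines:
        for domain in KNOWN_AI_DOMAINS:
            if domain in line:
                hits.add(domain)
    return hits


log_lines = [
    "2025-02-27T10:15:02Z query A api.openai.com from 10.0.4.17",
    "2025-02-27T10:15:03Z query A internal-wiki.corp.example from 10.0.4.17",
]
print(find_shadow_ai(log_lines))  # {'api.openai.com'}
```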

If you are interested in learning more about how Riscosity can protect your organization against Shadow AI and APIs, show us the meaning of haste and reach out to us at hello@riscosity.com.