You Probably Haven't Heard of the EU AI Act. You Already Have Obligations Under It.

18 March 2026 · The ThoughtFox Team · AI Governance & Ethics
Most organisations using AI tools at work have never heard of the EU AI Act.

That's not a criticism. It's complex, it moves fast, and until recently it felt like something for tech companies to worry about.

But if your team uses AI in their day-to-day work, whether that's Copilot, ChatGPT, or the AI features already built into your CRM, ERP, or HR platform, your organisation already has legal obligations under it. Not from August 2026. Some of them are in force today.

The EU AI Act draws a clear line between providers (the companies that build AI systems) and deployers (the organisations that use those systems in their work). Most organisations assume their exposure is limited to the obvious tools. It rarely is. If your software vendor has added AI features, and most have, you are a deployer.

Article 4 is already in force. It requires that everyone who uses AI in their role has an appropriate level of AI literacy. That doesn't mean a one-hour course ticked off a compliance list. It means a genuine understanding of how the tools work, where they fail, and when human judgement must take over.

Most organisations we speak with are nowhere near that standard.

Many don't know the obligation exists at all.

August 2026, when full enforcement begins, is closer than it looks.

If you're trying to get clarity on your EU AI Act obligations, particularly around AI literacy, we run a short discovery call specifically for that conversation. No pitch. Just a clear picture of where you stand. Let's talk.
