The IAM SME

Security – Identity – Cyber – Governance

EU AI Act Explained: What It Means for Microsoft Admins and How to Stay Compliant

Gary C

CISSP | CGEIT | CISM | CRISC | MBA

November 27, 2025

If you thought GDPR was the Everest of compliance, brace yourself: the EU AI Act is here, and it’s less of a hill and more of a mountain range. This regulation, formally known as Regulation (EU) 2024/1689, entered into force on 1 August 2024, with phased obligations kicking in from February 2025 and full force by August 2026. Its mission? To make AI safer, more transparent, and less likely to accidentally overthrow democracy while optimising your Teams meetings.

What Is It?

The EU AI Act is the world’s first comprehensive AI law. It applies to anyone developing, deploying, or importing AI systems that impact people in the EU. It uses a risk-based approach, which we’ve translated into a handy colour-coded table for those who like their compliance with a splash of colour:

A RAG Approach

  • Red – Unacceptable risk: practices the Act bans outright, such as social scoring and manipulative or exploitative AI.
  • Amber – High risk: AI used in areas like recruitment, credit scoring, or critical infrastructure; permitted, but only with strict obligations around risk management, logging, and human oversight.
  • Green – Limited and minimal risk: chatbots and similar tools; transparency duties apply (tell people they're talking to an AI), and beyond that, largely unregulated.

Why Does It Exist?

Because AI is brilliant at solving problems, except when it creates bigger ones. The Act aims to protect fundamental rights, ensure safety, and stop rogue algorithms from deciding who gets a mortgage based on their taste in 1980s synth-pop b-sides.

When Does It Bite?

  • Feb 2025: Prohibited practices banned; AI literacy obligations start.
  • Aug 2025: General-purpose AI (GPAI) rules apply.
  • Aug 2026: Most remaining obligations apply, including the bulk of the high-risk AI rules (some product-embedded high-risk systems get until August 2027).

Five Ways Microsoft Admins Can Stay Compliant Using the Microsoft Security Stack

  1. Inventory Your AI Systems with Microsoft Purview – you can't govern what you can't see, so catalogue your AI apps and the data they touch.
  2. Implement Role-Based Access Control via Microsoft Entra – least-privilege access for the people (and service principals) driving your AI workloads.
  3. Enable Continuous Monitoring with Microsoft Defender for Cloud – watch AI workloads for misconfigurations and threats across your estate.
  4. Audit Transparency Obligations Using Microsoft Sentinel – query your logs to prove users were told when they were talking to an AI.
  5. Document Everything in Compliance Manager – assessments and evidence in one place for when the auditors arrive.

KQL Queries to Keep You Out of Trouble

Here are some practical queries for Microsoft Sentinel:

1. Find AI-related service principals with excessive permissions

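A sketch of the sort of thing you'd run, using the standard Entra ID AuditLogs table in Sentinel. The AI keyword list is illustrative — tune it to whatever your developers actually call things:

```kql
// Surface app-role grants to service principals whose names suggest an AI workload.
// The keyword list below is an assumption - adjust it to your environment.
AuditLogs
| where OperationName has "Add app role assignment to service principal"
| extend ServicePrincipal = tostring(TargetResources[0].displayName)
| where ServicePrincipal has_any ("AI", "Copilot", "GPT", "OpenAI", "Bot")
| project TimeGenerated, ServicePrincipal, OperationName, InitiatedBy
| sort by TimeGenerated desc
```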

2. Detect GPT-like activity in Teams (because someone will try)

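One rough approach, assuming the Office 365 connector is feeding Sentinel: watch for bots and apps with AI-flavoured names being added to Teams. Column and operation names follow the standard OfficeActivity schema; the keyword list is, again, illustrative:

```kql
// Flag bot/app additions to Teams where the add-on name smells of generative AI.
OfficeActivity
| where OfficeWorkload == "MicrosoftTeams"
| where Operation in ("BotAddedToTeam", "AppInstalled")
| where AddOnName has_any ("GPT", "OpenAI", "Copilot", "AI")
| project TimeGenerated, Operation, AddOnName, UserId, TeamName
```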

3. Check for missing transparency tags in chatbot interactions

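There's no built-in "transparency tag" table, so this sketch assumes a hypothetical custom table, ChatbotInteractions_CL, that your bot writes to with a DisclosureShown_b flag recording whether the user was told they were talking to an AI (one of the Act's transparency obligations):

```kql
// Hypothetical custom table - ChatbotInteractions_CL and its columns are assumptions.
ChatbotInteractions_CL
| where TimeGenerated > ago(7d)
| where DisclosureShown_b == false
| summarize MissingDisclosures = count() by BotName_s
| sort by MissingDisclosures desc
```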

4. Monitor high-risk AI app sign-ins

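A sketch against the standard SigninLogs table. The app list is a placeholder for whatever you've classified as high-risk under the Act:

```kql
// Watch sign-ins to apps you've designated high-risk. App names are illustrative.
let HighRiskAIApps = dynamic(["Contoso CV Screener", "Contoso Credit Scorer"]);
SigninLogs
| where AppDisplayName in (HighRiskAIApps)
| project TimeGenerated, UserPrincipalName, AppDisplayName, IPAddress, ResultType
| sort by TimeGenerated desc
```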

5. The Geeky One

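For the genuinely geeky: a sketch that baselines daily sign-in volume per AI-flavoured app and flags anomalous spikes with series_decompose_anomalies. The keyword filter and anomaly threshold are both illustrative:

```kql
// Baseline sign-ins per app over 30 days and surface anomalous spikes.
SigninLogs
| where TimeGenerated > ago(30d)
| where AppDisplayName has_any ("AI", "Copilot", "OpenAI")
| make-series SignIns = count() default = 0 on TimeGenerated step 1d by AppDisplayName
| extend (Anomalies, Score, Baseline) = series_decompose_anomalies(SignIns, 1.5)
| mv-expand TimeGenerated to typeof(datetime), SignIns to typeof(long), Anomalies to typeof(double)
| where Anomalies > 0
| project AppDisplayName, TimeGenerated, SignIns
```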

Specialist AI Tooling Add-ons?

AI Security Posture Management – Microsoft’s Latest Acronym Buffet

Microsoft, never one to shy away from inventing new acronyms, has introduced AI Security Posture Management (AI‑SPM). The idea is simple enough: if your organisation is dabbling in generative AI, you probably want to know what’s running, where it’s running, and whether it’s about to leak sensitive data faster than a sieve in a rainstorm. AI‑SPM promises to discover AI apps and agents across your estate, assess their risks, and give you a neat dashboard to prove to auditors that you’ve at least looked at the problem.

Now, before you ask: is this included in my standard Microsoft 365 licence? Of course not. This is Microsoft we’re talking about. AI‑SPM lives inside Defender for Cloud, which sits on the Azure side of the house rather than in the Microsoft 365 suite. In other words, it’s an add‑on: you won’t find it bundled with your vanilla Microsoft 365 subscription, no matter how many times you refresh the admin portal. Think of it as the optional side dish you didn’t order but now feel guilty about ignoring, because everyone else at the table has one.

Key Features (or “Why You’ll End Up Paying for It”)

Discovery: Finds AI workloads you didn’t know existed. Useful if your developers have been “experimenting” without telling you.

Risk Assessment: Flags misconfigurations and vulnerabilities, so you can nod knowingly in meetings.

Multi-cloud Coverage: Works across Azure, AWS, and Google Cloud. Because apparently one cloud wasn’t stressful enough.

AI Bill of Materials: Produces an inventory of AI components, which sounds glamorous until you realise it’s basically a glorified spreadsheet.

Verdict

AI‑SPM is not a freebie tucked into your Microsoft 365 licence. It’s an Azure‑side add‑on, designed for organisations that want to keep their AI experiments from turning into compliance nightmares. Whether you adopt it depends on your appetite for risk, your budget, and how much you enjoy deciphering Microsoft’s ever‑expanding menu of security acronyms.

AI‑SPM vs. Security Copilot: Spot the Difference

Microsoft now offers two shiny toys in the AI security playground: AI Security Posture Management (AI‑SPM) and Security Copilot. Both sound impressive, both have glossy marketing slides, and both will cost you extra (though Security Copilot is now bundled with the full E5 stack). But they’re not the same beast.

AI‑SPM: Think of it as the nosy neighbour who insists on cataloguing every AI workload in your house. It discovers, audits, and nags you about risks. It’s posture management — the digital equivalent of being told to sit up straight.

Security Copilot: This one’s more like the chatty friend who turns up uninvited, drinks your tea, and then offers unsolicited advice. It uses generative AI to help security teams make sense of alerts, incidents, and logs. Less about posture, more about “what on earth just happened?” or “do my current CAPs still fit my head?”.

The Key Differences

  • Purpose: AI‑SPM manages the security posture of your AI workloads; Security Copilot assists analysts during investigations.
  • Where it lives: AI‑SPM sits inside Defender for Cloud; Security Copilot appears as a standalone portal and embedded experiences across the Defender, Sentinel, and Entra tooling.
  • Focus: AI‑SPM is proactive (discover, assess, fix); Security Copilot is reactive (triage, summarise, respond).
  • Licensing: both cost extra, though Security Copilot is now bundled with the full E5 stack.

Verdict

AI‑SPM is the stern schoolteacher reminding you to tidy up your AI projects. Security Copilot is the slightly over‑enthusiastic colleague who insists they can “help” with incident response. Both are useful, both are extra, and both will ensure your finance department develops a twitch whenever you mention “licensing”.

Another Thought

Regulatory Reality Check – EU AI Act and the UK’s Musings

Of course, no discussion of AI security tools would be complete without a nod to the regulators. The EU AI Act has already shuffled onto the stage, armed with categories of risk and a determination to make sure your AI doesn’t accidentally oppress anyone, misdiagnose a patient, or impersonate your boss. It’s the first comprehensive attempt to regulate AI across Europe, and it means organisations will need to demonstrate that they’ve thought about risk, compliance, and transparency. Cue Microsoft’s AI‑SPM, which suddenly looks less like an optional side dish and more like the vegetables you’re told to eat for your own good.

Meanwhile, the UK is still deciding whether to follow suit with a full‑blown AI Act or stick to its current “light‑touch, sector‑based” approach. Ministers have promised to keep things “pro‑innovation” which is Whitehall‑speak for “we’ll get around to it eventually, probably after another consultation or three.” In the meantime, British organisations are left in the awkward position of needing to prepare for EU rules if they operate across borders, while also second‑guessing what Westminster might eventually cook up.

Why It Matters

• AI‑SPM and Purview DSPM for AI give you the audit trails and risk assessments regulators will expect.

• Security Copilot helps you explain what went wrong when auditors inevitably ask.

• Together, they’re less about shiny dashboards and more about proving you’ve done your homework when the compliance examiners come knocking.

Closing Thoughts – Compliance, Acronyms, and the Joy of Regulation

The EU AI Act has already set the tone, demanding that organisations treat AI with the same seriousness as data protection. Risk categories, transparency obligations, and compliance checks are no longer optional extras; they’re baked into the legislation. For anyone operating across Europe, this means demonstrating that your AI workloads are properly inventoried, assessed, and controlled. Microsoft’s AI Security Posture Management (AI‑SPM) and Purview DSPM for AI suddenly look less like shiny add‑ons and more like survival gear.

Meanwhile, the UK continues to flirt with the idea of its own AI Act, preferring consultations and “pro‑innovation” soundbites to actual legislation. Whether Westminster eventually follows Brussels or carves out its own path, British organisations will still need to prove they’ve done their homework, especially if they operate in both jurisdictions.

In short, Microsoft’s AI posture tools are not just about dashboards and acronyms; they’re about preparing for a regulatory landscape that is already here in the EU and looming in the UK. Ignore them at your peril, or embrace them as the compliance insurance policy you’ll wish you had when the auditors come knocking.

Call to Action: If your organisation is experimenting with AI, now is the time to explore AI‑SPM and Purview DSPM for AI. Better to get ahead of the regulators than wait for them to write your to‑do list. Speak to me if you need help!

Gary Clarke, Head of Security – IAM-SME