Production AI Institute · PSF v1.1 open standard
MSP and consultant session pack

A useful AI transparency session you can run this week.

The AI System Disclosure gives MSPs and consultants a clean first engagement: help the client publish basic AI transparency, then turn the gaps into concrete PSF evidence work.

Engagement posture

Do not sell fear. Publish clarity. Then help the client close the evidence gaps the disclosure exposes.

60-minute session

Run the meeting like a standards body, not a software demo.

0-10 min · Inventory

List where AI is already used: chat, email, documents, support, sales, code, finance, HR, and vendor tools.

10-25 min · Boundary

For each system, record the people affected, the data touched, the autonomy level, the owner role, the human escalation route, and the incident process.

25-40 min · Disclosure

Generate the AI System Disclosure and decide what can be published now versus what should be improved before publication.

40-55 min · Gap map

Map priority actions to PSF domains: input boundary, output validation, observability, human oversight, security, and vendor resilience.

55-60 min · Next move

Agree one evidence artifact to create this week: data boundary, escalation route, incident log, eval record, or fallback plan.

Client language

A simple script that lands with non-technical leaders.

"You are probably already using AI in more places than you think. The goal today is not to shame anyone or block useful tools. It is to publish the basic facts a customer, employee, or regulator would reasonably expect to see."
After the disclosure

What stronger assurance usually requires next.

A disclosure is a public transparency record. When the system is consequential, the next work is usually concrete: workflow capture, controls, policy, evidence packs, incident exercises, vendor reviews, and formal assurance.