AI Bill of Materials (AI BOM)
Also called: AIBOM, model bill of materials
An AI Bill of Materials (AI BOM or AIBOM) is a structured inventory of every component used to build, train, and deploy an AI system: training data sources, foundation models, fine-tuning datasets, prompts, libraries, dependencies, and downstream integrations.
The AI BOM concept extends the software bill of materials (SBOM) idea — already mandated for federal software procurement under EO 14028 — to the AI lifecycle. Where SBOMs answer "what's in this binary," AI BOMs answer "what went into this model and where do its outputs go."
Typical AI BOM elements:
- Foundation model — name, version, vendor, license, training cutoff date
- Training data — sources, licensing terms, demographic coverage, known biases
- Fine-tuning data — internal datasets used for adaptation, with consent and provenance
- System prompts — the production prompt scaffolding that shapes model behavior
- Tool and function definitions — external APIs the model can invoke
- Output destinations — downstream systems consuming model output (databases, CRM, customer-facing UIs)
- Hosting — inference endpoint provider, data residency, retention policy
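The elements above amount to a structured inventory record. A minimal sketch in Python (the field names and values here are illustrative assumptions, not drawn from any standard):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AIBOM:
    """Illustrative AI BOM record; field names are assumptions, not a standard."""
    foundation_model: dict       # name, version, vendor, license, training cutoff
    training_data: list          # sources, licensing terms, known biases
    fine_tuning_data: list       # internal datasets, consent, provenance
    system_prompts: list         # production prompt scaffolding
    tools: list                  # external APIs the model can invoke
    output_destinations: list    # downstream systems consuming model output
    hosting: dict                # endpoint provider, data residency, retention

# Hypothetical example entries
bom = AIBOM(
    foundation_model={"name": "example-llm", "version": "2024-06",
                      "vendor": "ExampleAI", "license": "proprietary"},
    training_data=[{"source": "public web crawl", "license": "mixed"}],
    fine_tuning_data=[{"dataset": "support-tickets-2024", "consent": "contractual"}],
    system_prompts=["You are a support assistant..."],
    tools=[{"name": "crm_lookup", "endpoint": "internal"}],
    output_destinations=["CRM", "customer-facing chat UI"],
    hosting={"provider": "example-cloud", "residency": "EU", "retention": "30d"},
)

# Serialize for audit or exchange
print(json.dumps(asdict(bom), indent=2))
```

Keeping the record as structured data rather than a document makes it queryable: an auditor's question about data residency or model version becomes a field lookup instead of a document search.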
Standard formats are still emerging. CycloneDX added ML-BOM support in v1.5, including a machine-learning-model component type and model cards; SPDX 3.0 introduces AI and Dataset profiles for equivalent coverage. Manatee, ML BOM, and other community formats compete in the same space.
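To give a sense of what these formats look like on the wire, here is a simplified, CycloneDX-flavored document for a model plus a fine-tuning dataset. This is a sketch, not a validated CycloneDX 1.5 document; consult the spec for the real schema:

```python
import json

# Simplified CycloneDX-flavored sketch. Field names follow the general
# shape of CycloneDX (bomFormat, specVersion, components) but this is
# illustrative only, not guaranteed to validate against the 1.5 schema.
doc = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "machine-learning-model",
            "name": "example-llm",        # hypothetical model name
            "version": "2024-06",
        },
        {
            "type": "data",
            "name": "support-tickets-2024",  # hypothetical fine-tuning dataset
        },
    ],
}

print(json.dumps(doc, indent=2))
```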
Why it matters
Without an AI BOM you can't answer regulator or customer questions about supply chain risk: "Where did the training data come from? What happens to our prompts? Which model version produced this output?" The EU AI Act requires technical documentation for high-risk systems that maps directly to AI BOM concepts. NIST AI RMF's Map function presumes you have this inventory. Building it ad hoc for each audit is exhausting; building it once and maintaining it is the only sustainable path.
Related terms
Shadow AI
Shadow AI is the use of AI tools, models, or services inside an organization without IT, security, or governance team approval.
AI Governance
AI governance is the set of policies, processes, roles, and controls an organization uses to develop, deploy, and operate AI systems responsibly and in compliance with applicable laws, standards, and stakeholder expectations.
Trust Gap
The Trust Gap is the difference between an organization's self-reported AI security posture and the posture verifiable from independent evidence.
