The Clause That Doesn't Need a Law

On March 6, 2026, the General Services Administration published a proposed contract clause — GSAR 552.239-7001, "Basic Safeguarding of Artificial Intelligence Systems" — through a comment process normally reserved for procurement schedule refreshes. Not through Congress. Not through an executive order. Through a contracting mechanism.

The public comment period closed three days ago. By the time the clause is incorporated into MAS Refresh 31 and contractors accept the mass modification — expected within 60 days — it will have imposed compliance requirements more specific and operationally binding than anything produced by four years of federal AI governance debate.

That is not a design flaw. It is the design.

What the Clause Actually Does

GSAR 552.239-7001 binds any contractor that uses AI in the performance of a GSA Schedule contract — including commercial AI systems embedded in internal workflows, operated by subcontractors, or licensed from third parties. The clause reaches deliberately past the prime contractor: even if OpenAI, Google, or Microsoft is not a party to the contract, the prime contractor is responsible for that provider's compliance.

Core obligations include disclosing all AI systems within 30 days of award, using only "American AI Systems" (prohibiting components "manufactured, developed, or controlled by non-U.S. entities"), enabling human oversight with traceable intermediate steps, reporting security incidents within 72 hours, and certifying deletion of all government data at contract close.

The intellectual property provisions are where the clause becomes most consequential. The government claims ownership of all "Custom Developments" — defined to include configurations, fine-tuning, prompt libraries, retrieval-augmented generation indexes, workflow integrations, and any enhancements to an AI system made in contract performance. The contractor retains the base model. The government keeps the derivative layer.

Every prompt engineering investment, every fine-tuned workflow, every RAG system built to serve a federal contract belongs to the government — not the vendor.

The Anthropic Precedent Under the Hood

The clause was not written in a vacuum. Shortly before its release, federal agencies were reportedly directed to cease using Anthropic technology following a dispute with the Department of Defense. The Pentagon demanded that Anthropic allow its model to be used for "all lawful purposes" without restriction. Anthropic resisted. Anthropic lost federal access.

GSAR 552.239-7001 codifies that expectation into contract language. The clause explicitly prohibits AI systems from refusing to produce outputs or perform analyses "based on the contractor's or Service Provider's discretionary policies." Safety architectures, content moderation guardrails, model refusals — none can override lawful government use.

The practical implication: any AI vendor whose models include safety restrictions that limit certain queries must either create a separate government-facing configuration or exit federal contracting. The clause provides the mechanism. The Anthropic dispute provided the precedent.

Procurement as Regulation

The deepest feature of GSAR 552.239-7001 is structural, not substantive.

It is not a law. It was not passed by Congress, signed by the President, or subjected to notice-and-comment rulemaking under the Administrative Procedure Act. It emerged from OMB Memo M-25-22 (April 2025), which directed agencies to develop AI procurement terms — itself an executive directive, not a statute. The clause explicitly takes precedence over any conflicting commercial terms, including end-user license agreements that contractors had previously agreed to with AI providers.

A procurement clause with a seven-day comment window now overrides bilateral commercial contracts that vendors spent years negotiating.

This is the pattern. Not legislation — procurement. Not regulation — contracting. Not formal rulemaking — schedule refresh.

The federal government controls roughly $700 billion in annual procurement. The MAS program reaches nearly every commercial software and services category. A clause baked into MAS contracts propagates across the vendor ecosystem faster than a statute could, with fewer procedural constraints and less political visibility. The EU AI Act took four years to pass. GSAR 552.239-7001 took four months to propose.

The "Unbiased AI" Problem

The clause's weakest section may prove its most consequential. Contractors must make "commercial efforts" to ensure AI systems adhere to "Unbiased AI Principles" — defined, per executive order, to prohibit "ideological dogmas such as Diversity, Equity, Inclusion." The government retains the right to conduct automated assessments of deployed AI systems at any time and to suspend their use over "performance issues."

No definition of "performance issues" is provided. No technical standard for "unbiased" is specified. The government may terminate a contract for failure to comply with these principles and hold the contractor responsible for "reasonable decommissioning costs," also undefined.

This section creates a mechanism by which any administration can exert behavioral pressure on deployed AI systems without passing a law, amending a rule, or publishing explicit criteria. The compliance burden is on the contractor. The assessment methodology belongs to the government.

The Supply Chain Trap

The service-provider responsibility provision pushes compliance obligations down the supply chain in ways the industry is not built to absorb. A prime contractor on a GSA Schedule is now responsible for ensuring that the commercial AI model powering its workflow complies with every clause requirement — even though that provider is not a party to the contract and offers standard terms that individual contractors typically cannot negotiate.

The structural choice for every MAS contractor: renegotiate commercial AI agreements to extract compliance guarantees that commercial providers have never offered at scale, or stop using commercial AI in federal contract work. Smaller contractors, lacking the leverage to renegotiate with hyperscalers, face the steepest burden. The clause may inadvertently accelerate consolidation toward vendors who build AI specifically for government — a narrower, more controlled, less competitive market.

This outcome, notably, would benefit exactly the incumbents who can absorb compliance overhead.

What This Is

GSAR 552.239-7001 is the most operationally consequential piece of AI governance infrastructure issued in the United States to date — and it entered public life as a footnote to a procurement schedule refresh.

That is not incidental. In the absence of federal AI legislation, executive agencies have found that procurement law is a faster, more tractable path to behavioral regulation. The comment period closes. The schedule refreshes. The mass modification propagates. Sixty days later, the rules are live — binding, precedent-setting, and largely invisible outside the contracting community.

Congress has been debating federal AI legislation for three years. GSA has been writing it for four months.

Whether that produces better governance, or simply governance with less accountability, is the question the clause does not answer.

---

*Sources: Crowell & Moring client alert (March 2026), JD Supra analysis, Wiley Rein alert, Burr & Forman analysis, OMB Memo M-25-22 (April 2025)*