The Trump administration turned to artificial intelligence earlier this year to help identify Department of Veterans Affairs (VA) contracts for cancellation. But the tool in question—a quickly developed script known as “MUNCHABLE”—was built by Sahil Lavingia, a software engineer with no experience in healthcare, government operations, or artificial intelligence. Tasked with supporting the Department of Government Efficiency (DOGE), Lavingia created the AI tool within a day of joining the team in March.
The tool was designed to label “non-essential” contracts for potential elimination. Using older, inexpensive AI models, it evaluated over 76,000 VA contracts and flagged more than 2,000 for cancellation. But the system quickly drew criticism for producing wildly inaccurate results: it misread contract values, in some cases pegging contracts worth as little as $35,000 at $34 million, and misunderstood the purpose of essential agreements.
Canceling Critical Services with Limited Oversight
Although VA officials insist that no decisions were made without human review, evidence shows that the flawed AI script had real consequences. Investigations by ProPublica revealed at least two dozen contracts from the AI’s list had already been canceled. Among these were contracts for maintaining gene sequencing equipment used in cancer research, analyzing blood samples for clinical studies, and providing tools to assess the quality of nursing care.
VA staff reported being under intense pressure to defend the importance of certain contracts, with some given just hours to justify continued funding. In at least one case, internal emails instructed staff to respond in 255 characters or fewer—shorter than a tweet—raising concerns about the seriousness of the review process.
While officials claim that around 600 contracts have been canceled so far, they have refused to provide a detailed breakdown. Congressional Democrats have demanded answers but have been met with silence.
An Inexperienced Developer at the Helm
Lavingia, best known for founding the small e-commerce platform Gumroad, had no prior experience with federal agencies or artificial intelligence systems of this complexity. He admitted that he wrote the first version of the “munching” script on his second day at DOGE and relied on ChatGPT-like tools to help generate the code.
He later told ProPublica that he would never recommend blindly following his tool’s outputs. “It’s like that episode of The Office where Steve Carell drives into a lake because the GPS told him to,” Lavingia said. “Do not drive into the lake.”
The tool analyzed only the first 2,500 words of each contract—typically just the summary—meaning it often overlooked essential details. It also misidentified dollar amounts and other key data due to flawed instructions and limitations of the outdated AI models it used.
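For illustration, the pattern described above can be sketched in a few lines of Python. This is a hypothetical reconstruction, not the code Lavingia published; the model name, prompt wording, and output format are assumptions. What it makes concrete is the core flaw: any dollar figure or scope detail that appears after the 2,500-word cutoff is never shown to the model at all.

```python
# Illustrative sketch only -- not the published "munchable" script.
# Assumptions: an OpenAI-style chat API, a stand-in model name, and a
# made-up JSON output schema.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

MAX_WORDS = 2500  # per the reporting: only the first 2,500 words were analyzed

def classify_contract(full_text: str) -> dict:
    # Truncation is the key weakness: dollar amounts and scope details
    # buried past the summary section are silently dropped here.
    excerpt = " ".join(full_text.split()[:MAX_WORDS])
    prompt = (
        "You review federal contracts for a cost-cutting team. Based only "
        'on the text below, reply in JSON with keys "munchable" '
        '(true/false), "value_usd" (number), and "reason" (short string).'
        "\n\nCONTRACT TEXT:\n" + excerpt
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # stand-in; the article says only "older, inexpensive" models
        messages=[{"role": "user", "content": prompt}],
    )
    # Assumes the model returns bare JSON; an unverified parse like this is
    # exactly where hallucinated values (e.g., $34 million) slip through.
    return json.loads(response.choices[0].message.content)
```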
Experts Warn of Irresponsible AI Use
After reviewing the AI tool and its code, numerous procurement and AI specialists criticized its deployment. Cary Coglianese, a University of Pennsylvania law and political science professor, warned that large language models (LLMs) are not reliable for high-stakes, complex decisions like those involving veteran services. “These off-the-shelf tools can’t be trusted for this type of analysis,” he said.
Waldo Jaquith, a former Obama administration official who oversaw IT contracting at the Treasury Department, called the approach “absolutely the wrong use of AI.” He added, “AI gives answers that sound right but are often completely wrong. This work demands human expertise.”
Another concern was that Lavingia’s prompts did not include vital context about the VA, such as which services were legally required or how certain roles support broader healthcare systems. As a result, the AI labeled contracts supporting vital backend operations as unnecessary, including parts of the VA’s core procurement system.
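As a sketch of what that missing context might look like, domain rules could have been prepended to every request. The wording below is hypothetical, not taken from the actual prompts; it only illustrates the kind of guardrails the experts described.

```python
# Hypothetical example of VA-specific context absent from the real prompts.
VA_CONTEXT = (
    "Rules before flagging a contract as munchable:\n"
    "- Services required by statute (e.g., clinical lab work, benefits "
    "processing) must never be flagged.\n"
    "- Back-office contracts can be load-bearing: systems supporting "
    "procurement, IT, or data infrastructure may enable direct patient care.\n"
    "- If the excerpt does not identify the contract's downstream users, "
    "answer 'insufficient information' rather than guessing.\n"
)

def build_prompt(excerpt: str) -> str:
    # Prepending the rules forces the model to judge against VA-specific
    # constraints instead of generic intuitions about what sounds essential.
    return VA_CONTEXT + "\nCONTRACT TEXT:\n" + excerpt
```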
VA Defends Program, Musk Approved Code Release
Despite the backlash, VA Press Secretary Pete Kasperowicz defended the project. “This sort of review has never been done before, but we’re proud to set this precedent,” he said. He maintained that final decisions were made by experienced VA contracting officials and that the department is simply working to eliminate duplicative or wasteful spending.
However, the logic behind using AI to determine which services to cut has been called into question, especially given President Trump’s broader plan to slash the VA workforce by 80,000 positions. Critics argue that cutting both staff and external contracts without clear plans for continuity of care puts veterans at serious risk. VA staff already report that even minor cuts have disrupted patient services in hospitals and clinics.
In an attempt to promote transparency, Lavingia later published the AI tool’s code to his GitHub page. He said this was done with the approval of Elon Musk, who was overseeing DOGE at the time. “I thought it would be cool if the public could see how these decisions were being made,” Lavingia wrote in a blog post.
That decision may have cost him his job. Lavingia confirmed he was terminated after speaking with Fast Company about his work. The VA declined to comment on his dismissal.
AI’s Role in Government Under Scrutiny
The future of the “munchable” tool remains uncertain, but internal VA documents suggest the administration may expand its use of AI to automate other parts of the agency, including the processing of benefits claims. This direction has raised alarms about the potential replacement of trained staff with flawed automation systems.
Following the release of the script, Lavingia received messages from VA contractors who had seen the code and wanted to understand how to protect their contracts. “They just wanted to know what got them cut—or how not to get cut in the future,” he said.